[Binary artifact — not recoverable as text. This is a POSIX ustar tar archive of Zuul CI job output, owned by user/group `core`, containing:

  var/home/core/zuul-output/
  var/home/core/zuul-output/logs/
  var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log)

The member `kubelet.log.gz` is gzip-compressed binary data; its contents cannot be reconstructed from this dump. To read the log, extract the archive and decompress the member, e.g. `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz`.]
>*wWHm>'y1-Q6[@K$9Q4h]owB+6w~r_0_ݢN̮'i- Xԉ8CEU@E7Pcq;,#Rkxv/='P%֕-A77_(^Dc4NrTjӌ#G6{B'ExBjyehX)Ql=N;[B7zf֕Jf ^؋ XO}) MGa~>$Ҵny͞ڕi:a'du Ux9(ٶ EI0>YSJiLy"L;D}%g" e"0b+jy}8> x$TVa S띱Vc&^ˈiDk҉nDO|kJ:qNg4@v02T6v;K|);ߣÅNgm6֡MHѕ4zo`úӾCkOQz{0NmvdYu-D =Znx(wo颭quVpus[_H4?oǎUMeQ ng@0%*l9|&l9|5!I1M9;M>gs6M>gs6M/\-Wxr\-Wxr\-'qrL[+ o[+ o[+ o[+ o٨r\-Wxr!KN>!n|ma%*xS4|=)ƃ J!<`_LԺlg|A8̓16aHuI4LhQraa[]TSd|AJJ bT]\#{Z'}),s}g5]p$k_NhoiWb/Y)5Iz9N`.0 $- T)JY*w{.\N*`" B}T܀pQ>RLKGm7 ,3{a뗢2w]+u=<͆ɯ 3?Or2*0!UII%$@tY$T:"K'U2`r ka iU3@ĠL1hL>N@kC:_#1}poNc/Ǵe)$8XG!1z ",iP%LR#1s:uG3eN˜vHN;i;St߶7ocKMS)s0_,={K)#G b6pd-6([bS޴2oj3~)xl9&r60VFudXDKIK-J))dI#ݙ^DNQԐ0~M UXe>DW[9#*ӭU3L(s:=kT\z@]15y=f'.]jLe,:5E9Sb`i/"brQ$<9Aeܼ7? ڇV!KXAO;1SVN L.P3f4j|*#,R.Tk%i ʮk+@oJ0 aL~ P .(9K >TyI%Ϯ'Zzm~}?:Օ; pޥz\f٭V9AKMOfV\t o֤ҁ״{@>CHIp'_\=_rH*LDnb$`<"1dI(I`u\XܙG0-χh  {S2Y+*5eDDL Q4`KJeL,uoGvS;8gdp|ݹf5»oZL#s=PixJ*WŊ8y-Xhm 4wFm6*GT KIw^"7 QqTM,,4:VFf#2ʃjhx0LQH6}ĬHn?bKPcΆc|-u:9]z_+3}3}Оiy:h&pmwP,jS'?q_ !c͘aHn|3je;VTFWC[29ԽDWx*(( -F .(Ϯ|+<֔`lx&z{r񃞛 u$P7>s* ?NÏ%)g2cOJ"xp0J7bi1g;Jaظ^tHV{N[:Vg$գ4C1=i&z6LeZgdSsK&1?ҷQdoJgpOn`FU8~;K &Õn~  =ߏ>r?^OK?~򰣝ǍfDIad-J Vxe8 ݟp6ξm Pp!!. / /nZSmk QObZ+vtIFJrhK*Pԝm\E_0_by;̮'i- XZzb*qv0D>ƛ_OCkh`U꫿N0>*H+PR4ؕ.s=(~V/΄2 s"5'Hw.d#m1M?5z^$gztv9 vS {?'`NA?>12LKe|s|s >33A6=8iM&bny\7/7 Z: tC+ 0(EMᘷ'IWr 2a%Z.\:TO,c!^C0ke7^E7l!!rFHB! 
E.߂>L9N7:/9'G39u*Px(G;i8jm@R/$NǯKeȝ޵Y7);,vݼj;%ߊ.³@c~W!/fJcid%G6(`)$93$s %`@lJaJfҥ%68R$#Z{Q*Oc.*51V )dLV{/_5 \ H;f3U*)QYc)ޏwLaZ\%c'1Xrw3+L"t)+ZŸ_+.UAW$oxzmA߼)^ `ϭMk4{Hm(v/ ~)w0YZElT$0AB% +Qw$_ i?)M\ RxGt&.VeO$D*g$:10 F+5j '1beP\N5I R2]%y[%6WǬ[v_sv/]My:F M(XvJ7blܬ4p࿹yg+p SRJ7ioU2wj%~ڤw6uu=ly{їUڮuMxz?drLPCs4 M("zGgׇSaqB4`~:Ep:$]i$Y{I%nkK;L\KCQT#a5y)szf9Zx[d]\~⣖+6xC myJ^RrѱcEΒF |AX"|ډZ;i8v &u,"f4P&zG V5qE\-U­r4uMгnsIIyB/F>a$E @# Ӳ)!iOFwIU2ڧ ɓ,B(Ums[eC]kz2dۤ# uX6Eh=P}~WdN&kG* &kcYUhEO@b^b hL4=߻p1kǑ?{ȑbr#] l}Y1K<u=UÇFԐ!XX͞W5y_鄽Y rx&Ute!RKUo!R q* )Bz y̢KސYlY-!`"Uh"daY EooFU;]Z^ڡ2t%+ۂlEy0H)Adu=cH >[l/q be0Z4*ʼnsµm-`2<~÷,B MYP)y?(>SkyJSEUߨmukfLۋZV+xأ4 մJ1j*+iTK̺uwxA]\K>9*(zb#\Mh{( !LY|+7tp=]JKx++.4ρRkXzT*0yzݳW~_ )IE3H<18~qD'V$ giA|d|80㻗//|>BnmwnqY\d28CgehAS*svt7ߞ]s. 5_'xF2peFd-wUܾ~ݳYkjh.9оV~D GNR#qo r ]!Z{BJBOzDWX3p7ӨVtu=t=`A7tp-}+D)]]#]iCOS[o ]!ZNWRFz-=+x? e3m]tu=tezI\>r){^_x|2}ūk|,r?Y\[>Mnc(ztXrXáY9 #hOr<"Jf%!F+{Uv5"fKⲻ[9מ IբlCDOxDWXo J ]!ZeNWBL++fYbBFBWwBTBVH*=+,++/th5;]!JC]]!] (BBWVe\:]Ikk K¥xW>D"]]#]iBYO KWCW]!ZAD]]%]Y*3׵yDyD>~ؗItFJjbED4Il"3,3$ѱٗ\4-e< gs~Ŗ>*Tn?_2jrLHQ@i-#+,,Ns'|͊iqhy(L %70, nMH Y_P@ U(`SCFV ClF92uGWXpo Jo *wB}0YSO  ]!ܳ޽}]!ZfNW@WWHWj>]!\m}+DkE P2k+IʻBҠ\{+ ڡeP@Wg+E# r ]!ZNW%ʰJcF;]`F7 =5OWR@WWHWSDW3pq#BS툒@WWHWIB}B  ]ZJd Q2k(zvCfU+ie`;gS]@W2hYy+D+U Q* gǻ\C{mP힋@Wg+ι!Bԛ`rwBBB\j)<+lX \ K * ]!KmjҨ@WWHWZIjGt]!Z{BB2J ]`+˽ FDٷky] ;6BBWV]!JЕh(zqC4sW_|#C;U;BCizMT+`k+\,C<Pt4nYpujv~x-ny\#gC}UǏAzwaV?2-cZ1$Q',"9LTJa t# vȋBP/qbyvFc0 wٶ"e>|~be{~-[\۠vu A4wW qisK ~ԓ_^`Mfj~nEZRQWQ2h}3dO~*+sU=>Tprl3 ҸpQn'4zˆ@ߊIuǥs.#ε*B4,I4Ǵ҄lݞYY"Cn?cx=?s_٢n> ?ѨV?߻Qt8!!q$ ܻ`  / \|o1{EǟǓ/+X8U_Єr[>+ _ipӿ`3ZwfOi)|s؅n\x>@}&ԽXwg|OO?j{ u}(G\]ya~nލtϪI1? baë7 /t`-*?|ZRowV ~ +wXQ`N/;.Aprd#yy|8g!{(ⱋViUOW3Eqdub#ETZf,xx5ޏIz;pF~?1 >_NX,bk=zK>+Y1KxM*2.$siS9eVX0D&sb[lqE$Wev>*nW׏\8ƾPkFg];3'E.E r Y  Lɘ$T'T 屠qʺ0- ̪YV}B܈*J18ɒA?6{tjo|e8[eGe|Ƴ?%W)qHhFȀl Kxa/!{ݧYy@<۠Ƀzڃ ׽[읜qg/' S2gP\ʓ,R!e5gV$<2Q:a;{p{o;AN%T6u27~\R:Jk%7FP,њ =:a7ϾM'mvӫݺ{! 
h^N/jܻQTJ̋ܗRJw<$!F@.&9e`.5:m²e&E]1R9|'fVơ`]-׋Gprg๽[G_he4,o @j1“qNH41%LX'ST.8`# <F7daT:N#S*kN`L#bTX F$eFRp.,lN`coƬ`Z2\i3-XdIjSNQ͝ISpgF2xtcY3tlqJx^usn½̇6 x$;]bB!Є1V',0sD1Tk%it)wLXq Y tdux5t:ڇcbS.Ѷ\U,-\W-gb=2L:,W~뷾jeL,܊&i,5Dx%<t"-8>tKK80w_(_]Ma+f_'tKH;r4Uh5Bk(3zlSJRf[P#Y7j i}~. "?QmctFr#Ss}tOЧBB[JL,0"t&.^Gd[nT[}vLl3(vsQ&̦98{)|ż0a|{o.v*V_HGI&Ι!=eJg?.~.~se a]ŸTKcm zeb+g{IqrQeZ@kC!fR%#/gNR MsOU#͜˚9U7bY|ڌ!sxGgJUMd^<&_c}skiEG>IE{}И>h!qf<`1X<^_6t"9/6N'1(,vJ0;8Ikc^f@C:bõpA1=Wɗ˃;._ZFʥc<5;-_<m=!4IО_ A3BTA6V 'ٟNg'of&M{WNS@{?E n_~zWJ=~ :L Z`qlB8S?89˹o2%́;l-^G+rdu_߇127N [5DyMy"E9F*m=T&BbTqA5s( T^u v7fP'ث)#Rx"%)̶fJBh%10#u kgA| }vñppyMֹZG)ISFDPJS̵91#&aޤ&F%B;Ht۾:ii 2+$. ,BXg)#ǁuTr/c8MC8_M|3Wg>)DžS ~.B$y\pȅG/ʧ~I£n^ atmr|0]IJa%x?OLw~'KokIKг&?Ɯ=_]ϝ)RŇejUN_ִYӆ B.JehMQz\ervWSa^?]|YߖEj77lӟv?cU$ueM/骪R]ՔLFA^ +y;ѬUoq4I#jlh֥'s$TA}>P+S|AyB(TN#׻o?;Wwoϟzw:wêػ]!Arg~I"vNMkjoR5Ul%Tjuftћȅ(/&)?QtgX@B|(|$JzhECJXj~ (MDE_WQ5* D؎T]ecL_5l:.i11+4Ä>[݀lnWm}2K?+ý&(q@hNj`}y0$gQ`xxgJp1n@Pl*C"[lΨo l;D$_ȃE:ӄJt.A)ˍH{0T'0`:6XbYmB 5շr1ҘN9FPM场%ĴnnUP$'w>`RT\ht8йv7RB~D&FQ s[(ژTI8D0),ʣ41Ho6ssꮰtWXjm^vݍ-e7ZʋmG!͐n\xkw}&ɋWxc&1#L"])s&vWtaoĨ%O.!KA<9BVGF׸-8>7ǫ0VR& 7MX&HSE57V"FMj}u"4ˑ|&2 ddvξXFh; :?ӨM8?P2}u3/Mij1JL3k̓#vE?1X9B>pN V)nsV^Q` 6aĂI 3Bs|"hʻTUMOcӱǫc% GIuA>E _ӠPGJRr_nZi.<舎2(ilv1X1&M>.-9>>zo8N*7 UX{XMVc<&)J-SLԚp[!- NKk&r]}8Ŷ7ÐRv5 ,,㧃Dg*2=Q4ZuU]{!+3bI_St/zp=ʄ^Ym!W6 @v56C_CĀ. 
t:=Yq;5NYgWfV Ûq]սwJ"NdKoևиՆutYk .| 9Ѷw9iպw*y fܧEN;ݢ q(rXքQq)bm () (ae*EpKLZWA\WAZwpupFMI*JZ@"̝!]q֖Y2Dw>NԓM`fjf-t7zF,D"6:e5K0a&i3K~ .ש*l'*=:g8y`Qؖv?K_k͚Ri#׬jhhDhΩy{ΩqEk6~wRI9Fפ&('||z:pㅂ'!ͦZvKe$({[R#|\7"gfܞDr쬅EדQZr;^fx\Qd\ &ceZ:ujUg|5vycW'F\Q򱛏E>Z <T#[[0JL%p(S>mƹI$gg wU|<ݔvxߩ%ڛms3u?jDq?`&X5~kg-wݷP_s?5M)!M\ %i*T˦eKۻZcE]P:dXQ[r$KWI‡2u+>TaP>I&%ڳa)G%x#!@"dlߣמ*dS#ZA j&PXlqukӀ/+)K:m}z~֥.[Q6;:eFCf>?0K4=6ݠa dm+krxPy" M)p;?Dk=њިt"4y( 4BiNY6tJSsoH1.SV2ɜe8Yّ ZR#[IȖow#L `UFcEܟ~C`p=$ʥ}vb{g.Òq`Lq .䃍ݰ7~] 6W?tͤvq3):,5;C"L\:\)8{WWH`Pk*{Qxw WAJ;:Bb[WA`vU-pR$38Un&CGHڽ{v %1•@Z6,j \q9j \i>t R*•S" +gg0HKU ,m \qIk\AZz %\!\i iA h J W eyCÕ9a n$0j3q:;fҲ #vp%:v1&i\n \q5o \C eIg;:"!,j\L\IWAZ}pR*17$bP`0eNsE,E$5;cc_4)3Tyص @_fFxQߤ ѕ& ] \Еtt+JWCW%[RW8Еa+t%h~;] ʬ*nЕ+AݡӕLY*– 5+VJ|t%(c+Lm0m.oF] ?w|G,_pΕA7AЦWWt1U ѕ۹2nf!JP2CWġW&K/op~S{uM8}UV tEJWz{[Veٯ]v3߾y=q iX6.D6A>!̓Kx-^ime62~Cj9png_b[*\L-p_z5~hz-(#i}% lm0Е͜Zg̡ӕ^Gc}] ` y+tJ0BLyCt"Bo.ǭU;`C+A(*@vKŠ~fқUnzě+sW@_] J딮(r-ΩvVJо}.C'Co0Е͛Ylӕtz*'] ༝sWuK̩vAJP>XJW[+~+"ۗWW~sW_h/]X1{+]=wmJg?GDW֦C+AHEg7DWb ] BWáӕ̬tu t }2?t`G1'|si1*: s(7݁'o߷ofs:qۃkwDO"zH:K`?vnw/{S/[)||nD ֲ1;4^\^!XsJ!]0xʇn~^vofg/1i`~7?K~? 
w]ԞL'֜Fj|;BVǃx_;*iay@}* (1薅yxBx^/ w?f|d0?} 6CGCrcϯ._qǏ/ۇ+s Ŀ^+.75On{yh67um)Ŧfxo]o 9<#Eo^]#]9E30PWg} n(b1s<+w SsubBM5bklc܃zs]-5X\ɢ miޛҍO7Rֽv&J{Y^lG#5_煆0}jtsmD +TK0zCK@E!5bҚhi2"Ա61J6a3Mge֒}ɱ&.-Eײه}]j)挱|Bw3@W(8(8l(Ø3$B#KO 115E4CcRFܜBZhfpbuJ7@x$-9 ~qus6F_h턀:clڈM<5ˊ xZ5 J4PGPy՘jҕrWz 6c/m̭i|N.DIU@6`63a;7;84~p`THHJlB6}jvC*5D@ƑDet(PhteX|.H13 v[ʱؽ6ʈ6Ѐ,`6pj(\_dG 1[TPvF;}tUd75V$E3$uC Ave-@fH57]O_ [bet;tn(Sѝ5E4GRF[@a՞ @Q 2"}_ (i(HRE*&2H]{D5ڽ(i E"c5Vr#@BC %H -JĈA ݚ` dw)#*D" x!ǧcT4eYȠD1?J_ݡŊ4d9c6CFr|R1PT yi"0E!NȘ9׀E;OvߙxⴝWrZ|8W?Z^Lzì̧g]`Ilm&gV b; H:p2 1CБ]J9Tk%ŃZ2C6a1%A2H(ª`QBWEJ@J$jC2ke(.Lrgx&TW]de[g[ߤx$DfH64zY*â{wь<&{kuhAb; edM^#AvG< \T&/Ӫ1[z̓ ӥ"&b[mȎhCMFD "_ %_;)z+-, r> B7!(BD`'cW&􏣻^Zzyysp69"Ե§8:TV2dNf~^r3?qoxS2Qj-=嗷N||ɟ/n%>^A9[./vrۤW6g}\ܜ~ 0 z|7'yy}Ozws+&C4YVx柭Ogx>C#x=7paKBy;F$f:@R7 j9JNƏuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:k٭CےQ\ri;F}C~0AuѨCPUF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuKdc7c-BoJVQuP'F5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQu#oɨ *u׻u6ă7ej9FQG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨF5QG:jQuԨs8F߯|O/_NBVJwõwѯoNj{4bڐHSތH2oWK\}E@ |E`IvCt%3m+@I:a+$.rhi{N-\¿N~_o>^},PmOE?OvrnQg٩aJaVz<j~loM?y$Dҽ\N1\oY*611>*@a.nfDiv=סxTu9RW/-nc[ۍ^}|}Y驌K^};Q\kM.{ݝ-iZ~>k4?6v6&w=}58w#d6xmP;ǭFLy;G22>[0D/I>UQMw{+ź]kݘak ضM^ZR/ oW>ԔT$ڏ΢(SIUI(~ݸk:կp^\>3vxڻP\1LQeDƻKULtsq*ʟL`nT4USa'Źk-ZuYut^t̿m\tuRAXrp>·$`)Rz=/^|I Q$x`/Hn!8IW/l98Y"A cvy :0.T=NMbQrFjCuU, I'{P^i뜧ffAQ PsZv R:zP XqeTO"4I ̀t@b>W/ +`slH A@L>byE #D2Hirbf߳:?}G3msf%t!V?#!)uc཮CaԥJ4^R\8?:Jy%;3NNrGI^1jn)TQRZyѫQpͻ܏.}FQ,OdPǮ؏:î~\>X>&&ǰ{jT{h+W[q|^ S7!MI"ߣN|0/"9B=ivRe׵VPN#dq||G < iW\}*đ{>ߎ$;N;RiϏݲ:&E@\{6:$V=+E Yt)oޅxC%G#b:A,Luc;'PrI<~){/eG 3c3暊:*6 %z,̦M?(Z O[q팋S8*ba+6IJ7Ѯ@q^`_ T/fYGƏLwZSv*SXfF/I\?>ҵP݁Jdذ6X`X l({2ם+,i!ΖBx]韻wzq!>w胄US3]&}> 5k=NPouk fvtS{}ͣ^lȘ_' q[jXT;icǂ8>[4~*GENyr!Ǜ1L{m>V&Tw?th5Hc19uLD|Ķ6rIh'liOcr-=~[p;Exbftb.kwzySxzƯ`^grՙ~D`uX.9q?Oag:[_>" L~4QZ7Z7_ioV;? 
_W4F C^Uff1b*<GFuaQ8kK KУ){v Vby@)= TXi/?Y\r- 7`6(LF]iJ,Qq$,Eqe<"#P4rz~Y{(I7vl{AAs=)LK╋˹ysbh7 %&1yYufY0 lo~zQ~b8g25~{xjȦ6(σbb]l}t(>)m{ǻ?j;ל֍tGk?Ujm4[ *EC#Ȇ]gm*wo,;>zI Ҫy '0J҅n,*ܻ>|;㽔bfR7Std7S^= = 0KZ#'"ОP 9se| uolU|Y,FK6n0#tegs< jƤNhxb_o{.w=Rڨ?y̾!JBIq`:Iq\T%)u-wٓ(6Z%yIqlR+ 2$?])[d++z9tFBt2tXW,z񚟮J˼ 8. ])dJtWR_"]_tS\RpYeA@ tP^ ]YNJ&^eJ*^eK"+zt9vjĮu:*tbtP֊^ ]QPU0Е[Rh[W ^"]1ǣ])V\Ϫ ])-;]i;"u˚D޺0AS9jj9jZ.zDFTvjR]Do=\J-9~=4y >^rRA >L *"vH`I sЁ722z㧿?3̮OOcj|CN=zL3F`~TOC0 tBę; /36zzm5g];\ǵ*^p՛7%W)x/_7O?~*OA ti\Ĕګ@iީڰaW7^/sGf8v|g89,gIG:|[ q -lYc[!K#m~46XkNء`yXE_ <` \c[rHD<ؽp>j=JOJCBZԧcc١ >gFv=v'AsR>xg A,B3 Z£}ll;tҨ(,0hJ`a; , +]9 ClcByUmFi<䐸>%>#L -m*\P p? mi+<]ҋ"A<|xYIl5r4EA,ji,S`b8iq!f(Au+p)Іf[ZrZ@;[ 1_|,Bp L}f$8a&@&CatA3iaѥݬj4 7jfC_˅߷xs6 yVxt6~=$D T PiZzt^JL%̷X ͂٥(sJ]bR]Q?=l]!k21N3-@]=zt,:3>pgB}vyy~\+bcn_e|V?j\ԷŇx/_.7:-7x60!nzH0 x]P[~)mG,__hr3%x^zsnRZe1={ K!u]efkxU7ߊ"\G/I_/ʀQ 8" >rCAi\MY8' ,fs)ۖ%t 1GJkI 0>pSJaJc! Gbg{:~=htxC%``b K;U&:3t-ǭ!j;6v&>}Vy8&_SqL,KIG01dh-).9]R}͚`0H́w9bGNE .$yap"~ict 霸p,9JII+mH[ؼ49x.iрf>6gJʜ\ ^SM܁ٛbj_Kc;R¸G%UM$`~&˖?vt+ھt*)QS t=T =# pիz7. vzn,AuoeV ̠ #t~BҙǕ:(6u(mN߳i!H0;{c~;T#ThaeTr upo'`=$-mLR] L_Xk, i#[YuY3?1-=;.;`:g.^>ξW֛! 
!0 ѥZv<81oM.>$%$%A ๵vߤᄍBoe9Oߎ|z}ЗĽ2j*3[mtϑ%./'D8 N<"hqfKR$Ib:o:6rToS]pN/<J.Wһŵ\0qRG@I_=X=a" wv& zqQȇUkpU'G ȃ#Ժ|q4/36/7R{uᬑ3.-.̉see-Y$ ޏ %8qcX{ z7ʿ?wۑW $ rm %`Z l#Ęq0m2||)o>w-wؽ9s nO_7FrP*0*~>-Gg:&UbY,ap|qXҶ MS(TES0ڍ^?:cm8԰B<|#Lp7W'Luz*]\Ku`s`3U4/ýտQy=ָߩYޞ7Gͳׯk'WMN9z+z3ޟ<5VyEw8e#Dd&@ eIRՔARZ2 ߴxp,3`׃4zU Ap'Szs)**%^bY0LIˉǢzÞ4߫OD=եmp:qDbY|i,/|#?ΧPjtr;0)6 +l/vx#y[iy$`aq]I,iԏsU&-*fք[Ƴ:yXy;‹;;Ѣ;; ŝ⅃5?柽v޺ @ì3Zh8?LLB@wWq%ofFWᲶpce=Xea Czdr>:)F)=OG^yDGzMM FKVǹ>ʑh0 7K@_,cMxn:FLv?,hУ.f<{9M7| }r`> ,RwF㭻a|>* 5'N]^n3 0֡yEG܌[>9\^ +GeBMgVSUGPWGV#6zڳooEtJްv147ljfW+ɾfm9J²e(G%TG?9N*"x{qq798sDZ~2zl\^F94Z}m{b[ȋ{v]dns!RXm0&teCg6m57{x y'q74ft\ow-araȻw;yN^;yv-8`i<9 +>RP[Vp΃ydOMEs.yY}jfZmrבn.q+lv}gfS P`a!e,JDkm}Q&6wʲ(͔So:_ӲpfAHβ y<ҷ.P}@ v|"mΈ'uxɜwLwxHtzpnAfJ,Ӽso;W^8l{a5+Nlj:A3]ٲ4]i-eQl2@wKz K|cYM}M4qI8[\ڎIZ`F4̀a2O>s9jxzJ_x]ru&&%+fN pYYa&jg"qBo,jM-2sEh{NsRGE6͂:zY;Z0s 焿7jc $=04M%LYS.W=NKt6OKbM#??Z`W9W}%H'WC([A [An+(R&ά߂SF[WӮNw{_KY Dy`&(o5M; ֘)Z3XY=745va_k^u?Q)rlK/2.i!)j,yf%]}fXt mRTas:7vOrmfb)UofhM{ۗ؈ .Y +kɲl Q]D)ҕiTge+Dklw(+Fu]!`i[e+DK~Q&V;z:t) Dt( ]!\. ]!Z)0]=Aknh6pJiDZ^niKC)s?ٗ}ؒ@3+Y~p0j;ڝovGAt<~.]WWHmx5j;vTף{}D}ި+$77 KHeUHC?O <⮫+$uz]WWH];BvnD]E.&6G Xpou]WWHe%.GWW"wnըuoigjT;vb+uxC[ .d'0,͎pF/#`n ~;:LX0Xm"}!\a+)\>ϱ+09p1:e֌؍rS:DЩeIj ~LGd3ы,86 ᫕1av` :G]$ћAӡf#OaeO}nMln_dRMJH,7KTo#*H3tF zOM> j8 .8([F/u m)}ɞu blcv=ϡqF\+;<AcyNnL[YFAn$oĒ7ϫ?;yb 2Kz-\>GDA|aA'_:?𸆌TǍ[۷/kM[~=JTCAsZ!Rfi |ҼY }ۛrS6IOțLF@QC3}E~B. x4yRkNAW@%"$fԀ̤tOI !!@sy2/m)CVZi X < (g RG6.WI-piP< sЂes ZvhilCI&@4&_L< ;ȿxd9<9LRЕx<)!EG1ASi3ih#8O{ =w!$xsnnG7C[l|46K{upHC?-Yp5S^-.>)YM"w!T]ıI75\檛$Ntppo#Z5Uβۛu!r zz(n.F9;ZV쁝;d] R\nLEA2+$ovz{ NZ[1Z[Y;M:-GKvulݱ/q"T7WR dg׺LU^(fmELT_{p7}b\mrz=#?A?Ki C>kuM ֢Ԧե˄+?gO8'~L)^*Xaw30FX(C-FH6-Si#MST]ͳ?4fЪ[z3:u帞#a:kz!5T:N17AZ'PlL2`4w,޳ (L2]hSUɢI씿1}jzzYZtd: wŗN*u{o F~Fo{F0@&fLG9X&~";,vR[3mx-ASAoacZrܨ^3OVi&cF׉9+$X0Uw9vĞžo1=K[ڎ=A~&*lȄz Hs};¸$.u b1TN1HUPOF.?B(m&% h hcb." 
D[E-0Ʒ0w2hNd}wkޙ?9Dۦd(?.u-;kg򨥇ն:Ju4YY魩aeu2W$e/oNAU1 λ8j@ t~|mu3S!n͂` /^y: rءMN']1r3Fl(7l:, YSFye4Y@fGa8fBiN 0 ߤ"ۡrd2}y \~CG"=ie[ jה8n`4)@`Mq PnlN(V\KG}/4AOJ.׫L=(J=J}v˹yOcgKN U2BʲT@~;TbrW,0V5+ܞ1]+NcGn0xY:`ԭ=K .jm[p$Cbl*|R6}`qL98u'fDS8ʲĐ>lD)?eb# h] :{E{cá v0w(Ac<]FdUm<ղ_\(2 Xuʁ74_rOZ_ςiF j8?FYPѼѿsihfFygt;M\}7j|sa ͆}N1wC02ˆΗb`q NJ{kubO2EU0;&D!<pۮ77j`9LZ}Af%Q4/M,v[˚3֌`Po,払B1g^@Xx!P|seGNiWP!ugRR4enAt˜{@Q]WGۉȶ#J ;N <&cOxəB3m̞Na7vNA|C˂Ihy6YWm}IbB ,DZ`S,Yfly|?E@6yuaLG4uC$Bq*ejQ;^dC#ON uHM=wdDrFW4ɵ*4IJ6sChu@i*\FءO9vBe9(lR Ր}א.JM2|hrfމZ8]ڋKKP b͵M`6wUQA=JqC7lXh1WJDj|<[svt ]T\g,@0xY;/q^9N d=&]uwwbz"ŨtW*o@B'Pqrjɩk=*\ QB5)U\RRHz~Ҟ?A5DpmVN*e4z6yUFՕ0I 'jxe?Ӣԍml&e~An(g79(W:n$'xΐT yel,AlF1Sْy-EbWpZ nY#|kwW//k}x<*_$nDW"q\ 29E g8/9a>Fk6f*Rmiɹѩw)`[FDΜ{cZ>b^MA% *Yc|y`ȓ`߄{K \zZgj{8_2R0Y'{.qk*qE -߯zfH"Hdzj=1_aY ߝN? r/CNK bf !j}I @ s>뫫W̜oΦTxG9FJv* JBh%1"AyWB]P0 @;.)6ӟc{A)k+9r`3b<C߂"plQ)+ a0|COB~?Ҵ"9- V !UE sE HZ-I;K9N|W!~pG82l C6s?#e  }j!|k.Q%if}aIEe;O2ƅM>"W]IA|)wUǪxOي1AErD#H}F"rtN%V#So)Q~6[  85iat&j.\\2p'u4hZ^Tŧy86>d9+/J:7٥)%w3ֿ}{<\JbS8r[&g{MWV&'T9L~ʖީ.Uųk3_KwݯՓog]aυ0_8U}Km.8`~1h޹0zjjIƖ_ES3-Y,A}KLVhx3`˜JVg/j nH!a#aKU__1Kc|qyRnTl0gnpy/}>?ɷ1c(Ʋ}m~Ҵ1, *0w@hd+Q+/=Z}8p((O3+TZ}m!JsB eG pLž*GW"J}/Ƴ%x(`+ eZ3$."﵌FM&Yr܊tIk.ϖ `ȏ)7 U lάuO6\@qEןQ֙.K=U+U/em,BnPLB5UZUqb.-{4Ρtt^\=t稸yOBmͲ)zn_SmGZyd077{ّwySq:YzVzr>5g?+ot2ϙR|.R,ځ!hݺ!^ün]L~}mNis5>gƸD"-אRJ/rhreqc*h%{g ! [IWuqoog a1 &͌'pRQ&&ss? 
=m.Ӡ*"IzR* h;_˹_vW1lFrLG3Gqy# vg=3 F O2b;^2;eIoCq|8>=MX$ӗ$@=&3Y~ᶑc௮$bvܳ3؆ @&fZ~y>Hs7^qEL)߿l0 XM7{7XH>]cX{))JHDK9 Q ƑVJ^AVsܼFζխT-(q@Din`Kgtce^/7{5sf+)ekh^bJJ1D9)ƲiIm6>ն K:][<{'[qHHO\cߛ: z8E4tLE{·K|Zt6.'Z֯u'>O1;A[ϼ2(?Q4Ϫqhn<1b.UZXm}=}=7[Cxs{wU;O-9NZ^|.ׂ Kr}Z\^:z|1}rO`* KGXrAJs& 3FԺ>%H,y8$Hbla&gzxajgeJgPֲaG/diN"}`6w썼y{C <* 8CvYҒPNbXXH6q2|/YZ(TyҞ?W(" bj\U }Ԗx93~-P];la7W) tzeֳpaU '[S F&j3q G<šܔ01Z'OC q{xr=Gg}}< CqVf▱?m\K[y,![nu O682~-p 6 M!$^ʲo'LO])i諱}t>5\ds]oi_blNټ&=CDGG֗Q h3=ϗ7OVMWkgb LL']wTO&f:|^߁בjL,܊&1K'W\HWpa{?~”}cʣ*M~p{}a"bss5=o}G$WbB.iW-}s;AJt?oczSsIMQ']O*bMI؛7Az!7wo7Uwo~O>U7~ y zwKJx5 W"DIJ;[HZ DA {0\˕xIJ hzMmM$&!pYXxJRo]tٚj.[Nw7ݺaEJ8`t^oo+CVU~KU;k`6P,u^MFEo?dժ{51%ܤnǩԮy fcv3Ek`s4?ࡧT:WIdWI\WIZRpup wÈUU=JRʞ]"\q \UʮU=\%)EWWq$W 0ƨ3pUvd*IpWWI \%q1 \%i)wJR]"\)`XUWHz-x<\%)JsBj; I\WIZpWW|ˡG3$^I`L_:a7q5m \&/WIww+Ճ^jj:q0#o_Ό8#*0#3 l< 9'tyȳυsc̱|:7y/{z@jG~.虏K>z(^|Aa^/n(HqڨSYq3r ݡ7Yq>S3@JѪ=xH5wh/9v3|%gS4a?j5 ? /)|dLrJ͙"< 3;˜|*G}Q枆G2Anj6`> yT aה LS3N wNn*;Iž @7&@Kpf3p+pj*IoGzz,+!cޝ 0i)"WIJ!D.aJJ>u᪔C+NZf8 n:.R>e; Ev9ɭtk|JP";We*7L#(w8TM & u&%IK&)=teA@`ig %Dwz@JBÁ+)4CpS:WI\WIZ{IJ-{:@R2]\!e0A\EXW*IHnz:Ғa-;W 0!݁$.]$-{JR>@r\gW; njpԑѝlNJpСR%:W 0tg+;W `b pupEьvO3Eqw5q$R̗_]2'k""e1TQl<>f;*ehBW@PWu%A!9o7 ] Z뎝SztW ^%J+6CWӕwICW>ir+kVJк+AQU6P]p9mP;fsfPP[[wnpCt%V+AT? eT3* ~+] `nH[+A^] ʬڟ#]m2Qsfpٌ JWtv,q?Ⱍ'yswكlxnHL;TzMIs:hݽ#$eg-a-ap_vD'0C`z!]Y `7CWצЕe{t%(WztܟwCtD~3t%p ] ;J~;] Jo!]3>o8vԕv+t%hC>vqV]{yd ] \̱+Aӕ A9_>z̕O~:Jem?1fA`v71{Wnڛ O^_+ A|EQv徠n7VRY[@(/v VC&z˿}/_e+7;޽Q2|~{*7W(1{ovkMe_?򮴳>>LKm?R|&7Kz[~-wU{+zO'{sR~D)~߮g/ի/WG4Fu"òp@_W5ïS>0~$~!bGs/?8|sW@Pi?lN?_)ևW5b9Ԋs֧9FOV :Ι:@-RШ8ƯRɷg7w,~jO^]M}9n`.N.\.BMr3#xg{L.p S155L䃽7]_טm-) gz0TstRk;;63J4Nz2+ <]hstɘs)Q,1[+ya c@"ZΉMD(cBm2&ct̒a221؍$7;YKv%ba-6ZTK4WBN?PR~D~akPGCzX̘im T34-d+7nf<[-ńc׾M(flqus6F_erPgtQj[jalTTa Vcv1TDcUyW)Ζg,9f0֤1ϑl=Bl:/z@nվ\k.$jCf0L,ٺhbύ Vn)YcUdB` `Ց%CӢXtGj/`gݥW 3)P4AZe6ykX[ ɞBEEG|c0ZX Å&]pcYq$o9W-%ʅ*C ]@2!d^PX[ӸZ](. 
-c5 ώ Ύ񬽶# 0\At(nEG4uk6>k54m*|ݻѡѱWYlkK` Ŝ`' ɷ3n\-*U'6܀mN`6ph  /A [25G)מꇡBU(MZ &u!fs؂f\4uC Ave kz T ߕF3x@æ~(Ɛ $,( 2**"͓)L7kUڴVAx`f#aa1)Ks`eippR( 3ω{.,xnL.O /-Y`{sO 6Y1Phq ,s4eEp5(ȫ,/LF=AO/:ANF]VICUcWU]% ٧z(4YV^f#`HH/W.!:Bk(= 5tYgSq Uf&5,:f2XaxUu^ 4M6L輶a>DǩZ#./9ZxπyILPcj $k\uMANtm^XP ȨU;o@×(er =$^ns\f%6L1Fi& ڊ'Xk tdDhcc.)>xC4*[]k.;~@R@"t2/EyXKv _!n[2vu#ڱ,|Pt)oW%,-ef c58+{x,3A^J1vXΪ$1 iFrd~ȃ!:88"vx-},Mʰ*MS cE9 Ye3 ZN7'|M/%9Bb8hb"\#;K8VDx6R'y9 \ ioU; q$>#ȉ!GڗPF̃ G%@XN^]^=q&ӳA!n~zJ F/Ƶ#w`\K.U6Y^a-ʹ֒2 y!Y"4Fx31+@yu^07˳3Vy;%9E y8rkG7jv#cT&:U%\+cNv L1l[g',C)a {a'1|p#,9N-Ճ)6TyǴaxĥ k5x `%f$71`aRuM` A౮1նΘlFrE?,[ /+59VzPyBɷDv9m8+J5R  tj<ǠӦl?jo{aT dAe"UX&; Zfa-|iY`y!;Z)Ah@.iUi1jzPj⎙d PA.`OrӿURWɻå) D,0rj;VAjRܰIL]Xn`4&6y?n`9P!wS`C Ƹ/,0H5. z1/?^[//on/]fyj[Ix*TdQw2%g?Ü{?Z _k9M-hz` ]z/N^˵x^J N.[{ "V'Mӳ>.nn~~L m>ǫ;ywK?z!_{I.+|d ޴w'vًnWcBLtv;1hZ>\ F:19?{Ӡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPɂ:ԁq x9<ADg d~u4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:O иNhv:ALP'H?PzҠs DJ5A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:9ڠ-uvԑӗq3A9@zAWyFAuѠu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A MP7W>.JPw׻'6F,_|CF#@iK",'㷓+ܘ6+J!1}(s\s}2xJ޵6rcٿ"4.|?v" &d$U咣ݏ }-=,%jlU(!y$T=+_誅BWVS+Dh3+Gt9]!\BWS+D)T3+iXս =yu(]9 9Ey*BǮZWCWM.Jlu7m}:o8{R!׻:]@E&CZ5*$br% U=n\.ߑ3QK2܎7)\:\) XP1Q&([X|#Rfм·~ͪ.ѝ bNO߾q(C ϜHAqFI^eh?b5/dPx56DfME RVQ*G;},5F=1dqMq%B@w8*e*E.yV:JvѲ_VVU>*LVDFHPOɃYVzTҦot6N&W7vj)5s) 4Ȑr RY͈d ؝$̈́6$S:-_t>p{}`e^9> Nm B=;+>剅ەK9+_t>t{DWWu P*j ]!] J]!`zCWЕN^]!ʕ5·$ {DWX,BZ:]!J ]!])*=+\o jBWS+D)x3+MGt- ]!\כv@(;ujQs+`ѕRB7cWP;,,FoGtཡ++{3vh(Bs+n#Bc}+@k-]OWj xR3zAw; &pyU:z`@Ʈ}:]gL2{DWXղ/th9uB:CLz=ZT_ vt((tut%Gt{n5 QZU JrF]!`ӟ+k{;y`ΐ\= .덺BǮ(Js%\7tu}+D{Ht(us+ѧ;+, ]!\uh:1b+tMo78?Xb>,k'xӾ|Ya2rt ]qXswKsYH唫vX#9qT(F&F%Fڜz(ObL%eB+kY_ :}t(f·Lv7tp9U}+DO~Q QqX\*l x|;۹q·? 
7PdtMDOÿ/7Ro(QU>EH'qL%"9rCc x?E4+4zC-t2ku4Ens&W!Vfh TֺQ{ FV_ ?o| 0JFAfnny8Ɋ1WxDnGHRF47n>R-uz_/'L= 0_CUIhVfh3ʹP_"&?g9?oߏ 4_'`KR%G aǩum.۸=PC~4Ӯ$mn>z7s]3IFK8)-w ċ\ <) ^y|]+n4B;]QNZI3 ABډf2 si{acGRe{~3dznskO˛Z_v0\k'~Qk?΃q:v0np}A[̋kt@?xVy?H" y{yɸ{xn- f@vs.@+P+h$Sn#m~J7*:j*Kd83h" %,acʰQ^Z<\#}ZP&Ү(Q`ڈWWȕQE]u0:   cxUr 7]aدқ8N(V;:vvN/>EY7oNoNa ; 1w }:,^?nɠ0)g3ځ6Ur_x,YUgmV#;8 ;PlycaOv+} A]jײy QoǡO} 8r`0^/ޞ).) )3|l&iA6GƄS1 X!̻=ꚉGsduHiW҅NsgR$ k=4Oy/x$3HqJ33۴Y`8'[Uқ43;q믥*urqzWClqm>6q:jvW,hL  $ D".8&jLd*ԺKUvg~EV)m| V1 :S&A5G 8S4jZpBCTDt6wa+>Rm$Ԭ`҆d$БB ʬfFd#Y.Ɉgtc;GO+vLO뵇)Y9[RԣQU/FWOzf6?a*LojV@cb>i:OQc*E5!+$J2l9V&_ ^o>[(ŸG|N@f*8WI+Yi)U46V<s U'=?2Z<%a[uٟ j-h^[>Ny"?4ܞ)tiX_"BZ/:-RtZlBRD;EpI >]ULJ$+eZ U:XWeϓa2?bfo0oa=l{7r:QjEQȃ8pp=B:fE._##ovW[%mmY~W׫͏JoKeqKCՑ'>(Gkxև..MzELeg=|Qb;pYXoΘRRڛ#-;#%73Mki{wh1?`L򻉉4ussz+"*Ύ1g> mv*z]2 V3c8WQ VEzdV:5&ˇk: L6/@dDDi U"[Kl= UV#6MkU>Ȼ̂Z"!YG]̳Jl$bp.ijHQT`&e Eu?C:P:RRK%2{AhFS(ތs-w24f%郜ysYvg~w' .Zvk Bw楇S9_pg+:C*kwcN==kNz||- Y$(%~?Ze<~\XZb_ƋL ĥd 's*ᆨ6x+) 6 =nź^r(>vqf#*; 'C2וt6$U)/]TWEUR c+]Id#Ig*nCʂ6̀\`][o*딹rqqJݻy2'm-wEt7`9"rOzu8%!*g(+ٻrݩc /rZ(XO gȐ?fd)ÓL>Ü;F6гaګ㾟Yֆd%o˙/^\q%E\do ꦋGϧ8W| PPziں4}y[xuzqz28H #)Xsdi?oE(j_OOꔨpABFRu$G:jcT01ᨶQ<'1GK-Q|CK'9g:(>}t+r9ݧQ`r_ҩDgYGǿO߼)~P^p r$B#$q+ 8lhau5Ե(?~=j'>{%G;5,^b?lf_AQ쏚jIOhcȒ]L Eq5 G6!1̀@[\xl#cP-éjRHƺl g 30.2ma!~VD<@+~l_F!vhNGRrw2"@qIj ϫ$ˤˠ/Q,aUmC14O^pDh3\1(2S0#A9}HUK5E,BM-ŷu޺n`38!E/^X]n[IN.EM|8ݬ$=`u%Gѱ͓:VcbAY-(Kx>0hfaI[j*d,ѝzXGgۢIMCv)AO'ybv;&Uݤ6y !5Y! 6D`F<@1y|Ň "Y; $Rq]ؐ\ͦ*uUG45sRn~<wd[z D)!s tt:s3;izbg1 QWhi+j]WϹdFwOdwΫRYǣog;XP$x1q/ă2L,h5T"=&b"v-X}=D 9j*c Ǩvk _|]i`s*E lR sLxʈa"mr1^gա 9^oIv\\>uʠ|iu60{οXha(B?ӲSWfʏvqgi>2R RZT:iS$y٬ۑr{v݈'3/΄ŐA@w9"^#=Z)Ȝn]^vf;Lε~mS]Ӗ]NO:g3 *lT-5Udy^KK$^}Y8 O:ˋP*y3+疴 E5D}"$t2Smתy:o/_v^c D_3E|pe厖[6{x۷ڒl[<rUKKi(&|Lb>o%X<1hA' +gBe֦ ntL5*GBTH` yuF.o5kCytesB^Xo]^3<Eb^i?`2)pNwWkw"uD=3UhT90!Q%eǐSO$<Ĝ{{˾Q1HL451sD Dy &dYtRv\!a? 
v1Z^G̴~"9 O&'^3~EG'ulݦD^J$ >؎ IO ii{Qa [ϟ[WM˻Gv'Wt:osJҥܬ;EҺyBϋƷ\:rdRyF7{[xMdmf;W:4BMWݴsќ=馻v/_5]O}|R۲X8H\;z+h2Bo귐Wr]M^WӔ~E2>;'jէ(#{@ڙt;*\ uM$)ˣ hqkd_;șG,^^4޳j /dmDgӄօM{ma-f [l5&ۉ-ER헵Gk:1? 0{Q6~t<4{/f?mL%q=lZc,{(xNxznߍ_ w\e s w\ejvsd\=EsfU&X1WH-'*SIU1WO\<%.9`;*+ŮLRn2Zs͕!sRm7W\v\ej9vsbbzOwG~htu=rkQKz.h3WףrۜAv sŊ髧B*w\esɕ+*S̶La\=As$5sU|WU+&!'hC ٕr+*S+L.4WFMWyʟ ~?X:#aVޫ˛{=芆+VFXEgRO^o(=ib4*E!Jd"j3Ky@%"dR1ĄDa-pKrqRd045ȃgCn4:3϶/åk^gGwMˋlF[oۈ!Dtǿr+ק>+5F+5d,J*ebFL42NѨMȷ\t!rRIHNDt H2Pn#THnT(ʓ탊2%jDM* gi@"Sʒ>P<LBYL]ګ՗n1 J U-gʕ]~FMXbsG}NFII %aBZJa4('()[ԘRڂDkG= yK`rq%e5KQ<K[ϖ\BP*r\"WU*r\"׵$͋"WAE\"WU*r\"WU*r\"WU*rݏ#$cqj ݤLa?^U\YZ⾲r€!*?>a<Ɂ41voI{9$ڽ Jv?׏ͺ{Y߹"N>w"b3tPAxRbs?G#:9|١B#e1Q򚌏"i͟BSH+t"8]q4V"p.uXz[nRuRBrhaѸ8e q,tJ`U Jn)`*|'jX桢]j׹PiΓxğ &oTv0}?eKm<R pg.p'!(fJL6`d1t6`,v̲HÒzD8R@yTL!/YdKkQ u!@|B.Ds{D:f4%D*@pjT,eEry QFI{ gC:-V4h]jNk(E9L;]ݱ˖%~Ι~fNh$FDǕ@u< \sV$#JGRz#ȷv-+[gZ& Ju/xG\e% PRj"wmcO[a Hۛ.-  ȒW8C2%,Qd3qd7sh|sWL#,8-Ed\U>{]A屪_ySY~H8-f1׋PcKPdC4Pdc.٭/|ڒY;໋^cy02Z!DUi⸧K^l%Kvd^5ۤo:y<L- g,]mViY }6[84FJ4 6)A."K|doFï/*xr8DKɤ^FBXY W)#"b1+Cʘtk;_n'Qn綜|p~0a܎&v?g Aq}|)NcF1@$m 4wFm6*GT KIwP+ KY'QIT*N_{$'P&yq"70J; YH0ENdsk,Qesv׼tm?عQysHuwN2ޑEd L<iԜYpC@x,s$k%Auba@^Uh஡.9'-D|`އ)]lp5咦{lcsw#ޱ/}5PߍZgnnT*0.)(MsK3XZ9rf/=.&Nwzp#@B(. /)QoBT o2ɳiseCH,RT=q/ǫt+v(b5)*[x+|ԗWP45'4wŚ{_'`e=D=4mRhU{MNr5 ICd~@9Xɼ.TDDC_ۢWۡFUt+ޖ'-F_\;,&t0S RbT2xjE.˸sNc"c-a^909%&|,{5^-,1W-+ѱ6QMYlƪmEtT7<^Ɲq ^/g)}{EeWכ1V]u=锕9 .g狯jtRoVA8+_/ȹ̥ .r[P4<0smx er+gIqW"t^r6SFӵ͑P+*b<J3b`>v|c̱|</ # $g\Kr 3/ {`=Nz Zg/y۱7tkftN4:" uHryWk#Lg:Rjsfȵ1OKQ&zGz8j)#;uCY8:е@WvH@HRIHg N3_^-+/iӔb2g SDސW)n  ͯ1KW }bYˏɾϼ8;Nn/Yke-e` 0BP5;LjnDh:vFV j.$'GA ¶ s`΅9 {paf:?w.֑w04=\lpck!bΤ [ab@  ʏ!ΖEb l*O5x[naXc- aК1,R C08D +%c1oyzoSw1vvbדkG̴^r-i;Nm.:41H$0u[@G䰤N)0X;RLk *%C^A^nDe{A)k+9r4`FgD F N(s! _\e6$A $$hX! 
i&&w!Nxiu:|ÕvxϿE\8RR!A3{IH""a (&!{ktQƸ)O|8R)C]ϗf秡g&/pݘRLg fM[fON<>,mt Y٣><(~ LH}zR%01{*ếOμ/O_Msm߻-s[  ~Y`60xjIƑ_>u0y-3DRG YL|8W]<ٻW<Q Z=jM6>I) 5T2>\m+fil8R}M]qS?~珿0Qݧ߂8벁$}'؁GCﻟ7񪡩bC/aS^lMP~~v.սR۴f`EK%g Pl=Y+ޢiBjw1e/"h*]R(fi3p |CUYl N377MʰJP.=OM cy#(wR?5?4O_:$e*!YĸQ"@Q(kV _s;:Ri¼o!9QSIy$܈R fj2֥4AvP,BUhk0e!`pq ǃhp2gӄCœuۚj^wrv-"(u4zɉMbWXJ,6Rq*o@BV̠hŦFX3MD$2Ir59bH"jir9En %-P LkjGb'"U) J $ YBEmKĆv"Цܽ* _JkOh(vDq&KlDPDj ZA-7w~R'-͏uL`C#8qj,EZ[{Ǝp, * PS(Z,k'^bY3#Mt4ቤ+g]sITP9m/p 92iD:ik/"4~AK" oYfY#Ckxz#FexFԘkE N/$*k .z:hO5WVcKpx,@)Y1 GZFvفN $̌ w K8U[l,ex7(AgyRn_|-Y o%*U FD \H5| Q,bdhDuۊNK]p-OS: %k|N]gm9;f 7Twmm#~] ~ d3'dYF_mȔGxO5)QlQd3HluU]_UWW)3l|!ʖ96JHc1FmP&'ARL:g_Σ/NY˜o$]aƅ[0'#P}v6qi ^R.R@7--@Ϭa , ;)".:OXs>ʡOV׫trtǽÕ{•L3v+}\05L6xd)m>^ۀ`sP`6$Qs-9˝R&t᷂;+`BÀ{\!yL%X`T9F$㑅H1,}Dꥦ0+CʘajST? Cfit9/?O9=N춈Fɶ^+k)L@(%a4UTsc%bDOy^dH5Ч7nQ盓*/rj#\yoFn"LE.}w>ۊ` 8a QLvrO?lGĻv R,Q2(PFlprV*rjJ0z'&..$>:hAIEܶRXp$j ́j\&^gKߔ3:=KWL캼  Qwʪ 7UU,²Aܮ%wũ?;̀Ͱ&5x?'s q0͓h%taoRwj N'js-D̙27Tʎ01K};M>aiEˋH=%ܦR~kYP /Φ$3"p8cL\(1o|{&rM}vUEߟpVwrh5:̀1H$0u[@:FxaI9=1?X)i-XYXx$A^b9"1A#鬬c; Nj6Ed0M,b1 *bHN-\x֏x5cʒ>;(?Zi DMD !GF*mJz@3bf !j]mk\3 C<.&OI1q콠F̵90#Ƴ`ASVh'`Y cp?jDY+*H^`Q/I6A7 48NZ8[{ EEfϿS"rI>5G"2@ 0 d5Kjatm\eU?S"ȟ/;4ȣ}ΖviSDNm;5Q%P0 8+:mqkrTa5 R`Bz|,h62> ]fW̒G lPE~N]өgYngw} ~t~~z{~{O瘨w}w[:N]6@ܚ_"-n}akkho:4Uly4ԥs)هoo CoRRݻ[6afZ^!i*|_ʁ(s.&dnj]LAmE4qU3vW\B`&pRo}wi_c1[S ƌčei1(Z^DL>F8,r)~N SK `.0 p[6eW3>$cD] 'h* ?-UeC|`zC <"^K 2 `d$NUAai aNk:mrg#qJy5^r"d KF*N!32UUժ 0+0DF4IƶFT8G"W )0TB$@m4mYZ#gC9;[v2Eģ=Xw"qcC,%:AoFI:?buG$/&!)d)sx&JW1û8B OyJL3Âsf p+^5ꐣkv >G-kP!gT )e7;%2D"RuY[Z(ȭ[@RD{S( ƾ;0/"ro̰6FJe|9`X4foG{~$i#y(!;! 
P[`% ë{«b_~ҜO2r#F#DWºϡb*yyuW,Q`B<UX-zg՘IDk1ͭH!6K{r0m#БV6Lqfs ^.Rx=#n8 &ו^ʃ ^ʢ܊fxW;l%<<tXj@wqzT>Ꞩ{hͻ\>psݔȵ-Rk@S_Ȁ.Ε~ɡuOa:=m Y;e$nݼu]ϳ[Xy>qZ~k]oΞye͏Y}t<)Xqץ^!SM7Y_e)lϕ*;KO>he[/FvTek3G" (27Zc Xy$RD[o Cf5)<%havB]>l6W_5Cf|Y(xJW%W-AϨr^vqTz0 mRV;N8 qת;y-Xhm 4wFm6*GT KIwPG[#~ ^@t$wjYڦB^βܓe ;DM8(7}0YmDHn"'@ 5(O 2Y̙X hI؍n -eF_[i ~M8rǻc#0; q5|@@d13Xx~7`;^ P*.&GWO:icr-\P݆D!lPD\û/FH$W?QnQ)*1EI@(KburVrjJue+p4TBlZ* 4]4ךVzZ^MH<gx)I6y\,xH>jy,߷jٲjYj㱻)WX]T- {~_F]^j>[ͼD|Z//_r[Dnk^Wq$*3_Q/G-ϟ7i^fKyLzp%oZ! <2V! ROJBt1zlkVq~RMfwwyt40{u1Qa4ucSNne6InDynwHqf腸>5fM\%n17Mg>ug̳FH^:u-(gl`hb2.܂̋$[ta`P3(CˍcI{tPGm}/*B2Fm/hkul8_ne%MqD3t/Cq\֫94*YEj+t Auw 1jeV+~?ϪNTimBN z+(¢:YT]Nޞ'=FoONj_ߘrJQk*Y⥈#>2RăL*PKykZ) sX`OhÉ:޶ 6^!ےp|<$ґ=ҦUC}[9t74an/mX|IcC._E]!^z "JVqeiB́  Ηb_xM9:͛@|hm%5Wd8yP;^@M/XNGF`_y⎝:Fsv\pL.b1J&ijb (w[ !F%S69k5I-x?5??1X2hJG%d|Is][y#8\Ҋq*N)}s+2(ٸ ힾ> q-C7vv驱vZ]?(^UJ>wU !9Gr-GuH2qkB2N AR-lB돲tft@xiw;\˛QP{b#٠ɸ^L y03U1β^Ry NBPmc"^>tY@e?’zdĸR@yT,2%B`]H"Kr)k@+oNX˜є jIRQ*Hg(n hAZAZA96h{"očkU »zuCpeE5.t(E=hRҒ]" H],.CQWYZ]]e)Aua?pDmwx睓fG7D暌 3Ԓb6r, h6llWo 'HRK+-@ Lu>˻Ny0#;=Xۻ)?==}+..o0hFЙ]ɿז򛘞F^|:jđ$xI:8y^f(k'6!gmcT?<_m ڭ*NDQ??ĝe8kdelHU~-Ǧ:if2.@>M JP\Xdp*T^kfE4 8i2}LvY)Bt"T'rt=̩Oe%ISfv4.HaUҒV@c$ )\67XS}6eLp]&6?~SB&QWTkBPU,%E]ݠbHZ F]eqBi׮LuuFBeQWY\p*K*v_< D>FvqRuYꗃ* }_^Owh\4@C D5xeeA94j"$A I5JF)!Tc*7*zh [0I k,$\fJʼnCfC 90C0s& ˙Eм BP|!&2O*g&R8MqH sa8'Wc:qB&mIsc 688hJ$XJ1DВa ͢!4 /  ⾫uq*q7Kv|W_az@* Z E]kWWYJʋAu%тH]6 QWYZ}&nG]i¨13bl0`kWWYJAAueᰫ,fCQWYZsqq(,+8pgB=J`.hr.V˨㤔7O}> ^- 6c{dVk\ X"UP  +5J ;cu ,Ձ(۪*)#ȜQ9ָv)$$MDj'F+$&:Beײ AJS ĩFlA!(3Y5VHu mm"粢s`­ >͉;En3\LOV׏.m*8N~B0O*ʔqFj5 rLBZ%Ȕ$2Pvh(Xj#jk%NR[Uُ<x>-RG{k=ܷlywMLmW‘ y,)@ijRL"C%+L}dP‚7b(Y*C^kr58:yKҽػTcIP{ A٣1$SIIψO9Ď.T<{O4\3,[QϦt\?ќ'{s:~ג?!FG8̲tvy5z~^~QfͭksQ,ͅv\~u:;mV5Q%"wT5Jơ; &uhvPHHtz9=6SVuYJZNOoR){q;P?MJ J"7U1=X".2U+0VF9SC9?-Wq~z q3z,yM~l8ⰭL>sUlt=冺3C1 R@D hf(ř@q&eGb% z:4XՁGN(QkPLnoT*Ag|Z҆i\m1v!'̼-"ҩv!zv#[n]TO'dvw/hdžܢ ӛ~c>Q=iє||&c/OWv4g:7-{,k̚.+)J\)o txnƙ~+Xn{o nZwVDxj`hΫn\6 s 2 0 0/z4nYvӅU@Πxv 
LZ/7%}Ƕ}BMn0B2FU=h(n'n۠l8#){_Bv6#h}fevX?!QFM=}ls']-iT@m3ʍUw Pj~zm`Og&޶R/.8w|j;n/UZ۬}¾ʾn(,NU'Ie7&&&FRJx)f;㈏́!8 !%._lZqpb:N-ȻiFȶ$#{#_}yh2t$=buPCKB#O8۟U6,c!t.s/%r B!s_|=vzw&-j8Ͼ;s+ypזi"FΫzjzٸ~Xr:2߄ߩ3n% U6F)D;MM.x뜡2dDy &g F8A<'Gg8x8Q Q"i  2O}KZ1.@)Ecwnc-ba6AB\DP=޳ڭWkwP{U)ܥV)¢r!qk 8l :%*I%  ?>TuD1pr-o6GmCnoleN&'zQt2- PTm`;8zK%8 A1TBbJI L#xge@KB*fJYQ!B$^ȴ"u!@|.ɥa9cccF\vF$AKH.8!(Q{tVQycтs"m;(D.תw5eYG%R1F2{ A֟ A-4 UG95D$*ԿEU$I8%`$x:nma0I*WyoΙ~JY2Qg Bk yyFd6W/#|]*-@ Lu>˻^'EȇU*TN8S%&9uᷧ &;Yj w.@lsJJMd3BDLD$4BhS(ՅO.ܹޯpc54${;d|wm!f_߳iIRS%;x/TBTO"٨ދIju7ËIUO VhD( |+EAN^~ag, 2;LFII JV9B8 M4PFZ˫ꧼ:q>;-TfLЖbw'V/!?)4ݹ?yoG\>9.%\$Z;LZ\o8钲2%0ٌU(}#x[}\_o~hnZ^C]7wb\ȾQkq6b6yځ'nGNh5Wj4pEmkmF/=6×`MⶽEӋ0F\IN,ã%ٲؒNCCoÙ*H.ZgW}]s9xV394ུ $C=2A!fSu lh}uhMR,']ֶ+ 9^$^뗟d|LX9L9o/"ʵŕ;•c Wק%&CKBM#y">4'}xӈ( "슖)~<c+"')TBL!a{9h0$+TDIk6Ƞ2`܆>d3N mb:2}f05NݛSڹvy٢l˲qbC GF39E x񬈗"72Z1&WYɜ} Fp*o41m_yc}3@bF:Pq3-u5vct @]GL9, wa*o?b??Qg'EG`x?<^uG~+/?ӫUQݕ$5eI IrVoKM͒g'S >At_#)ŀV& 6]ַlH&<_fnSzv$[놩IGכwgo5{ڙ 0&D_u]]AW :čpNj &^rmG4f'N]C_r qB4U.%SA. 
,ZС8`% 'X:&c:#cSeE ^x>X`F(,l>dL(dDҋd.B BP(gD dDKаm9:|bDž.uTq)gkk|ew\?D1%7DYjy-˷.hwM!aPXtR%-&,8#Fui:II؂c">wA|*.F͊BY(Q2QV:r# ܐD'A@-U{C{AcPĸﰏ[kuIX7a)r(A+bQ"ρAqc D]c폵5m"HBLh  ;2!f,>B^\+H }A{K B OBzA9)f$ApE񖬹U<"p|}UqI:>4*0:?)>Q_?Ϯ\fOjWW&K5y?]# syΊ1Y*d 5f3.8:gG V\#?Y]ƤώV3@ G0x,% %]:ŜtI ҍ?.sWO:OZLWV''+dVs9FGqV4V7})@3(L[HiҸ:nͫ-іظlVXVucqm֎ROxvѣsp5ٗ< ջ?UxVxYIP9+)%DNك3?$zxw [Jew0ܿA3oЬV1o?e{D 1>NΆ A{n2zdj-Sv"`NZ9`R NEZ4~||&9pvsz;g}.07քّ 0ך$x^&O .ԶYWU9RG°Bk'=cqmD{Go_gq3Gv~iCpEwq5"9'=W˺n<;^,kBZꇗHxF3t8/|_xVb ,{t߼smp/=EnJ䭤-RW羀ۻ.~#"ht5_ZWַwOݷ o]ͺY)nm{[_y(0 novjΛ<"NzGS9kΦ?k-6rtO/rɿpRhn,}ɥT8USRLJIB^R4ڒ`IuVv@BBBJtIu嬅UTSݿRio?Ou(5IF!p52Wzʏ}8.4"KRȆ0ZZSrRqD2i#sTҠfф?-cv<_kο脍`+ J^9&KTRm&xp2L3_iTk'Ш [-!Û[™5qsAlK)nWT W[ZS` h}4( b#H%HKc"BadQ%Zkc')6k*rm _{81|m _1|m _÷}kaj SG0n/j>Xݎ6xogPٓ۸6kڸ65ۨc+&}ĠS:z{yeUޥ]'M*++UAWeD AۚO*u:Ly`+e:G\$9sY& qsgզ1u{6|3\'-}vJ4ʐzNCaghR1q/"qJpt\`w9=Jzm6*Gtȹ[oYO@IM^Wo`=$fbrjޮh-YZ/^-oanRvEFѺంgZ{8_!efi^EƙYd82POkTHʎ[͇H-QbQ$6lnvUW9nE6K\cI!r`Z6͌e\@'9).e D@.3 L$x:VsN?"E]pQb{An.hkBfdfv9ε>j|aXc>DYe޴zszs[YUis? ʭ f+.Zܣ\dZl߭fV1["n5ANXRjZiRӎZvdNk :ao\TG'=<'@ř.Cp*Ub"D$*EU$I8ғ 1ۄ 8.*A!xE>Aw`1g/e[EXӂ=Wlͥrhy7W$G*M Z^9l%)a ?E#B k+E?}W ֨,ȌDL&iꂔ*YaBZJ<a4(h`K۬_B8gǶy%)s|VyAŗύF S^ڂDkG= ?/$.) 
[binary data: gzip-compressed `kubelet.log.gz` from the `var/home/core/zuul-output/logs/` tar archive — contents are compressed binary and cannot be rendered as text]
bd˺$osK,ZI͹r39Xs$9=.+m6<_IOBr}Mhz5psL:հb$ e hj}]> f0hq.o|Q˫9;gs{D7|`| 2^r[qp19D=N>_P㿟gW5x6SJrȧ7BfvΌɪ!oeD`f~3u6Sm֝IXS\dQDXhTT2#&T٫RAʊI#nK|c"ƑD\g$E,B0%. "y-qnuݥ-w80- gMrZ{$Z_V >zFH؀vPb._ 5p"{|4VA]ը<Jir&,T `(Zʵn':?{׶H\Ea_v=6/O"-Ւ@o$I)JRl Y"Nʊ ֯ËC^ ]&u/i۳!}[l].yfQ\l!AWBRӭbVuސzq g7}z},Y>գ)':rRϏp\_J]>xwttۘ%wЭ_ݛ=p~|yG{dFӝvL_"hSx]֟;:p8&8ǹ|W`g>aˍR1 ]2w'4~z$z35~釘O57~]淯oIV0s.LEJǮ|*z OoZi܃=CUbJh٪ܫeW+XAWov&H2fKfГkTDzK6MDUKQlYbb]/~ۧuF:#"/{yq2onWpe?](#2/>m I.2*:f R3v5r;UQU_2cC$n6}Yנ5urZ:5FZ{c\fT-_  | WnlCn>;X^i]\|>ߨ~ne]E(`OxWF:Q,gffF!Y)6iY}ZVߗeuG~/ՍDI̥K.x{=: ͪ"Uo\k\Q5ռ TR4|ZVٗNK!Ӽ v;>XE5jBg>F˒;5[WeM'AR:Ԡ J!_* [Hְ>OQ-0yDo48;:B{~gޱ)P=@ϳ WviE92W.orW]uy.+FS["Nf4QzB"RFUԷh N/>u;zV/6/Х2 皍 l[RB6d{;AVS59.=4ڈ\˗AiM_izXǯ\Ʒe~II+|d쌍wlkJDKEbLY&FxF趼6&bxb(m.C d_]z1Œ b{@$:TZI(穤g|xwPdp>v^@[Ophk?xTC!*Y۴PU}zA C":UjCXy9ޖ>C8t1'38@ GKcxwACip bJ)!RX2[H>.VJN -JQM4tߥQhv}||q>%Oc/զ xe8n1_]̿)z=-=-,:r̢?}Wׇ妖yx9 ux7Qo2~hH[!}oo=xzW~hHr/N;c6Ѓ s40e"˘NY6oh%6 }S/w-[|\n\[<4ֵo0旾EsCjNgԜRfi8ZO/l`=N8?GK#>:{+jV?jVfO/iކ^[t@NE77v?::;_<2c^J j`nzݯo`G=nޏ4 ǔw=bȪ oONtʯOɶ32FA1r^:яl/ ƹXVW2Rj- ;p)Қ-\K xZHg!}g;Ϸchf#<o;^iVeV7:ŷ2*$rQUZZ%sHo٢^}ܗƎ=7_A'sj! 
~η'MoJқlF poxG+fwͧoo01DW0UGPBOt*+:\:Z NW~-oS^t$d+Zy]zZg_,'ЕLt=Xz`9 pm8hw(YOtx'8(w0tj>hޫh7HW C+vp3BW@_\OWeHW+6j(tޫ+$]Arb=v`7ЕR;]VztŠXáWCָ} ]yV:W֪̓}DWo$(iȕ5d}饁UGI[ Z9 ֱ p:uњwD&z t˝[ F'#t~̺W?}9=;W &ڦWwM(C7IgCgg9^R).?}E .nff939Q&㪪kHcュsgkY{sӷ\rmy{w+xvYI`qCG9awk¸'Cx,h@E~"ቹw?O!f5Yy?<5?g|f?69Y{ O)hHhqҪf%:]LҜUdYgv%j:ZG"^}5?~[%' pk뻃v^ߟ.k9non|vz^kٴ^Xݸһ|9!RΚ2%NNGX]0tqA۵$LTJH^c u֪Xeeq>TL.nõhj^yNdkg,.iud^{exLX+,ԈւdSɸAUmZt뛶R 6'Ãh1rׂ6dK9{n)bMPRz jU|eaV]!֛w،a kl.sؔMk%2eU%bT2=#a<}Wjd]< چdҩhkgІk"mNY2d Cht/}FG xIFë5-#y5 (C!W:BGf]F\Pv$q^]\o_/1kŕڃC'dHѱD!sŀOce ݽ*kjjĦki%9ꂻ 3HQ:wπجfsЮavcD(5% ~\Ϸ7e8eBHQŒ$XHȔXr !JPŸ4k;j\6g+3搌j1aV-3V8A 0jA1 F |.%醸0Q#^&| E3UC`S |BlQxiV TTPtB[g?h jvlg((G5H[u$kdunX"LՔ X8R]qaj PY^Og*) Ue[%0oaMʃJ,#RRmhaY8_w!kUtb&!ď a[*.7Wj`X:Mr5Ճ`&T2̂#I1G""U騛MP *[ rg5d)v\Eok(PB]: ߁ !6CA!SAp2PHSxP BHPX&TKDinԘeN"5%Ax`0M%BC\9L uH 0X rlh_ 3dܛd䔂D]")xufIyw"< eo_Ht[ ()%$]qB(^ u9kaոQUDI) CfyJ ՛J_J[РDt0P֌ 6B Z6 z° F񜑵 ҤV/ F L`_.ѭb]B_1k,4_#(*S^.NZ#'_p0g/fgץ{Ym:Qrzxwe&h^4Mւ'ck / TYd(yH4K*d`6uLA#`Lj ~HhTH삙Ԓ"4A"e$iyM!}$#]Kh\=ɋD0Acuksx ]ko9+?]6bvv,̤`Dh[L+ٲ҉XE$*IC\+.=YߪPsP5"VﬨFy¶",9e5@D`U0rw;̯;K3.`U$CeR֫XktdDPtGa>_=V$ 1P SG|À0 ^v@^I@)HʠvPJnu=m>%dB9 Aڱ ,0A%DEHmڽ!ݪf ECyp!(4g@ @$B2БwqAsBl$ fJ2RdA1A2 ~ȃ\ 8 wGzzڱhUAWEVeXDH{q1 !HƖDEl|?lMJ!j͢:+i$Ǯ&`Ÿ08iJ)2H<*HM w:|F4F.LiB)J2rXQl1w:Hm[rO҅U.E Gͩ ژ =~F+9VG=X(C%TL(ɘ+2-lPY; t5890Ah5›-ւ Lu({Z kLqT@: Z6 i0^ 2)Zč#{s|dTUܓ(d /U\ВFn5"~ކ`Vc6g . 
!C3F^xЬBF҃Ctbm z atVBZ]шw}$?t0H5ozFon/nX\mO':Ddr *ww!z$-gaOvo~ߧ.d:/J@a(nu1c˽t0^~azxspU l)6[sn86N D}'P|( /?@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N;H'P|S6s'8ngw|&8rD93'Pd'Г@H:d';^ Z{uԦ9'Wv@b'; N v@b'; N v@b'; N v@b'; N v@b'; N=tQh ZuZ= L Nh@ʰ@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N -9n R7Z5}'PjN}ta@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; Nq}^>bkW?0TfCoܹrwg@ʴukӎq pmƸdhK@ٸƥ7^_NݵX6e~24xy=/Gqȳ飯TXf,_^jDr,Ѫ.Og+گ\S(g?P$P{{;^ zAT^.NmQ@BU4C3 ڽ{[~vo[9m~)P'avS7nSj!׎L|spnTm׾]7De a,ݠޚ.DFbCq#?4VC59 "vHJ#Rdz_Znە?u}gRt]Le|1Y-+7Z1Zgݜ}q6!8 ]qRbzt!N7bਚIB4sJ0O L8ޟf w ROXOdG*e+Y BW@:]J)^DW؈fj uEh:]Jttem4DWf: ]06Ȇ ժ"fsWrc Еw%z ]\[+B;}uE(#oG Z+lڙj'ִBWv^B^輕-md 64CWuEh%?]iH84ΏK[ W։9;(]ixZ})M/==ݻ!(-"ڴBWjJaknziNWrb&+]-%o+kt+tEh:]J/ 9[+z񾫑h衾.6DWnnlF]R_{SDUۧOWɴE"+BPncQ(Ҿ+I nhm<]x 9Ut%e6f+}`Ǹr S-̳x``fcݔqJnVBٲ2i qfܩ}5.Ns\8N'x`ǘJL`uwqh3ZGTz#Rl)S^F=q];tEp}lm<]ejJEku"LR ttiM6DW؅fඣ (v1]]Y! "+B (Զ0]]9a 6+k]+tEhSĘ^pvD3tEplV 7u"1]!]oi#{)++B' HWQ [Z$"^BW6%]qJ=Ց*N z8qt5bGơwFЕbzjKi_vtRf ]ZNWR=+%!"5CW5CW6nt?te!4DW,UhntE(}+-+d3tEpivsDP*VWHWVT2Hj=ꡇPZttd5"n+rjڙ^+J7CWW7">lCtEGavV lF]od j*jlKt^MfV <]ʩt ҏlz}$tvϻ8F5bt58FNLWOnz74FkZ<#ylCM洇q%rG?G],R ] I#91_΍9:Zy!S "~8_{PttuQ7DWn CaC-ߨUCtEw~*8>BWk^ϧ+cژ6 i'$VBW˩xѕft[jve;.rp..),!\{ux>]\-,ϻpUNrT!2d-\jTJ>,Q],NOű<=#Ν޸,}.?ě|\O zO=>Wh.ݪw╬A0 //ju*^U'+Aet s5sogQ"LÃ忯UV tA1ᵄY^Am3%]՛g)CUzbٗ>C/fnܴT:])7y|ur=+tR.:7t{%l'dP5~^Nk^]"kj&7o}Lk #b}⺮ߣbEYerUv;ٿj -k"Xk2O2ӛF_3*nT76x(Z)cR`${1=$RҸ” l?ӇbcڕxϻHtt.%e*J7v0u2AQnty2 E,BJ(PPP{Fl|S_ +ͥ@-QDǟWsH 7_/o+g7.|Zwo],ƍ;-ج??+|ÇMy2#‘THV,jOLfo}2;~r,֜-*A#0^p+3{2\Cy6Ͻ\L_(0K<ܤ>vin gZl\_4@!ۭ]/CcI`Yzc/+nGRo爰y:]/.j999Rڿ[,~>Xv 2A=]}8ZAw{y-N^)qwWWYyє6'A,^IqI7:ܼz1>zq5fzڼtľZdWWjC"/9HFBO/޵q$0; # =qB?-)RKJrW=|5Flq5=]UW]]Ck,|C&Ea k$#m+PflI#Г"hUFVP}-8^>QKʺ"S2R5YȕIρAso38ƉYc&-gY0^ڟ5HF+˒!H3TpZޮPsG8 PJIX ɉՎtԬ"X`= 0cyp#Xs 1T2 qA /ܯ, -JogcEĥ|D3gM|N^4"؜ըzOu& f#Y޿Y ,4*0uLzJ{+Px0=(*d ,IIQ'(s/o'%oe< ^䊌 ] Che L62pkq-MH?i:+WeYɍbCN?m׆d6K4o-קWU9Éq{z5ĽZѐZkBp|v1޽{M0#Z~^yW_|ep0ka.a?OV.h .|.&w7M#MI#]lFla{RĄQmh,O>n=ѿrsb陮*Q7iԍj8 1u 
iy4\O~^+wΎNҩ5igr//i9׿?w/?{~}wϸpgowߒFr 'D>Z_%0ೡ==4|hk]^PRo$[R6a~!| S/ͿBӲfb'8bӄ,Yl~&Ćq:NEM# qˊOGڃ#GkxfgDI(ic_9QL`cP"8cd$0S;C6N Λa cQzܺTƨC˳KeOr%05k<`V*X+5db"ePJ+h 먒HWZ+m[r9r,_E)g)S9_2ZCP"oFd"* 8.Tp+N8(%J` 'n +?7 g7+6MdfOg15h?п~]S|PAD1Zy"fn@:KCR:lp#EDE:mLsu< T : jaUX:-+2!W"8kT<˞($<8-dȴ0BI)~u@A6g/jg)زPrkȹH @ <,.F/с 2IPe0.CnmQ٩@*pN?6xJi |N+)/S `0F1B-pzftd!wP1OxJؼesE Mp,1gSF̊VNub8F G6[_ffqk& +i!beҩ=tBKb`v2 25Dȝ30i[sY=i5DU>ݰ-Fvݽ"Y蹓KCVYZGG[ŘN=S- 9T;La 8?\5&լɗRv2OHM)لlJC_N@o5ܜo:=Oצ<8_d{8s>fZmNZR?eYzpA iC k9dZp_+t;[ twE3D!aNzߦI }wթozgp*;IJ}=~{1M.[_O'x5Yqeǚ"wTBEWYH="@}ҿ_/'@tPmmRkm6;i=/zjׁwD^YaURa`ъp^aЌ j2۝/QG!o'u3Mo[ \RmFL(ȔA# }LI&d4)@[- _-_AO^ͥk0{y& #S8*@UO7|ETPRY Nb si3_4-?چhe*/,\ҏ;.$Q@C!, Yf3"ː,2+&O8 9y )Ercᤆٍ_.G~\QHrt=h1BH4UAauT>61/}W^s&MQ_{a/Fimvs`ZeT6q@mHR$];k B]-4y3d^[BeKP-L1b8D2q*|=UWe:0dcfQ%!^ )L8͕.f7zГLXtlj{];~%[p3#ᾶմ~6W77?5}?'cv[ cwb0eg«dYkdMYV(I;˖w«Wӽͫw~:Tb4} .O#?.%.g {\WyKYxEzQKf$(@tYoFf^Oof >3.!6tmΚNL'y1-̳P{mYziQ7uWoٕ5ڻ/wzoTݵj'%$h5nGㅨMlWտ~ӿK&id$+xXɔ9S%en V-bժڡ>."K<RȖl)%g .9jiٺ#c.8)(.C>J!6"L)0nL=7mfi-bև& /C۰92엪ތN h2j{}.}\EUݹGTYzztyU\] WȭMJ8.}mۻ.~t\(HE+j]i]M[os8[gZVԲ6|xw3GK-h{goqOI)?򱡕޲X۟7wy}Ԧ)caS|ړ1̭ %>S(p&ͳNWEE&мR>o'`*AWʳPD2 &2b_L%R~ޛ?!dSAF6Zkɍ&J&{Jn@,Eed/0Zs!QIJf 7YeE88b!X#)Eä1qvU+&fz`[zՄVhz |G!<`}  )Y]+ YC-V44Bh*W`TdLBEK6.l6.l+6.D9gL<m9$NdB+4z5:1R7V%i6ѳAGcr) 7g;/݈ƪ\>`ƢLHPjoY.21D]+Z´amJ̭ʄ RR3KFA9%RhR$)t$px9 K}l5&n%^٩w>\v,apQo6D7U9c7&v1yѪs[:L)O]gtG4SsA UJm3i?M3 w췓M\/_6Y2YDN-s}iҽu/o囘z$sIzgs)~R;w;.8Wv%8]³ӃaIcӹ԰CV_\pp nqzϒ_"57F=W|¢DWW!nu&k}KGʹ⚸CEsݫf+w@l(|+@aT7[.2Tߔ,*5ӳ@qG_y4]7Ӂt5.g_8X.!*R픶j4hQz,b9Md_NScxܩkX#EV*UhJ'/Ni>EQK (I:b9`d6D؎F:FqƩt$UX(GMl%ςc()Y H2"s,SΤH^+3*(2c,C/gF67f?\\$?!BE\8x]6#R(BvJCi'-^>Y4ϹB8jє8D[T8&CÑg8CK%pf%Kܠ,$@8 gKBd(u/IB ģ '{(L]?W= lG{Gg~T2D:˸Z4$A+S00 xdֲ/H!U]sg]DϊO:Hˆh!^tP)lP}b{ aњ!jd_+T=CjDx_YVg>lIˢ(ơhjc_h+kD;jQ#nڤI\L{ Eθ[[mo杮t|MEUy>V9ѡw/$2kDiJKAS/}R.萍 <'NydfCŻxtO]XorZ]?H{ ߎ h b1Rjd#+ȍ0 '*$.7&8p4c=\ M6|x QDn55 &&JAIwOR?,_:P"g l>ф?Bĕ/|Zb]hBh{lvMKYA?ޝS1mns}:~u=YviUu\E,==;Wb|?, 5)mzuBmlwN?l<>ߊZ̬'0W̥3yJ;з|hF4nwϼ˜ߩu.ʃeD]{Eo_5ݞkS*1IY$Ej_m/Kzܼ]_g ΛӞ=D 
ƹȐ/8}b.kfWNs1[I!3[Yfؠ-|bIں,f|gV΄@|V pE%k(I&cg2 ́d:*tY".y,2C?|8i\#eG,h%U൳Pg;َkbvDlf}MztU5BO$g45'r_n؀˭ۮw<R0ezǠdRN AB!CyY*(C9DMTu0M{B="% h"o R* r`HK+#WgG!ȐV B*3(R+RV&Ì➣Hc.h^3`,f"I)k3.4+ڇPWkfȼ2:>rn0`)Ilvbš}v%o`Vں%몝4ѐ֘Őu%ڏJ ڀZ{چVqPEF/a6&79ƒnD~/ޱo Vm02 F9pufܿJٽrph ofq"MRG 8gk4JKdLcљg('7Cv$fM-wS?N3 w-z1 D2ųG?h/jӼ٥y%nk^0n{&w5(vuoNJ-S,~R7w;]pMWl/h6چ]NC{:v Bٹwiv;'m]onvA:UFcQ?DG|Zib>9?o"[߹鮆ƺ侥!ܲ⚸8B͛Qv"։|qrGKdv)p5-ߕ@55F-cZ*蘘QGR[wݒrin GӕOKv]vy#iuh1A;r!5UctMm.[,l!&Em;]5qo٩kXS jX zȾV(_MPd)ͧh jrVnN?|12-zv`8ƩJkx?p8^3ZKǒ[1Sr0ZdpE2)%Y @&A!W$0fߵ={l0#]_^ BیLJ) T6\ΧE8*O :ZPES026mRr G>Ñ?YBN,!"mVPO γ%s2N:̛ѓxt!d$~M^q5!lb*ٞ7JY3LeQu qœiHVR9``J1@$e_Bݚ9]3.Vg'aDHJZT 0+/jپP}Lt!hd_-UWRW=_TpUT FEmt܁P _` ̽`AˠKXN1 R=sUQ[S!0FyEaf+>|4&l#ʉBIzE$ZULZI`cY}RNTܳ ңg‡oYh`2!EMM3ɤ1tY42bWvqZS1IǾm*QgwOyeqVes.o_`UpsKB0C@3*j8(,KEG8R#a!eX$paԯ, 0 "V}VFD#bwZhmR$Su"01x@6&X$zBd)[[YΘ6@1*emLD.8RPF4iFz!kW@6xyl?~HjR/.ʸz\q.'ÔT@"gIHԞDC f!"RB qQpPt쉇5p޽ lKP"Ы#Z_kphwp͆܊+en~7ubiZau񇿯բ`CV_G,qAߙ_Ldz4`8v1EÀݬ |G_h?O.W?N>LK8*+AUwMq+ϥ0rI?ҘOLOUC۲n jBAU%=EP`UPU}\%!2XR" "%JJ`$Ä*{.뾴SԿG<&L6FuLTQwژE CkШjٮzB\CEiߑa֔uX뾆v떱B>zLI1ūSC2,iʡXr嚓AT`*#O9"sMP5YbY-,.tyO DIVǮ Xy,.b*&֖$wep ҚΙ+2Ypk%F_fOin^)f>]RIǃ9]W6Kңi[nn]~l10\;e췥딸e/=uu[ʽbwuIOPǫ1i:Y771^1+^ QqT1LPtOi.snS?L7'gqӗZ藖fN݀@RʍqI =-biJTnhzNCaghR1q/"RUWp,0J ܣoѩw9 Fa3zBBE#^m8W<Xd;^\{Qpw:ݧ[5g,I^"\yYBƏh^phNk!:& g ^C{OӱO2gI:0YX%f 8,u$ $%Y38gl7e[R N1vM>D}&~|y \l:*WmjZ B#yoS, dP4!0d<("7DzdQ%$'Y!Kk9 B9d,)f6I"a1[ $>P;GblgA_MĂ`͊S_|J2~ӥ}:GD,?%ں9yؑ`, ? ;JC2)&"Holr1H_i~OG5HKF&  Hg4ZA 'I߆sBpڴ RZF^#飱ޑt7G 9G뒤Hs\FoЩ;ҞR=`kPo=|~W@0+ٍe6^6A h΄OTԋ77'7 LOEbmv6+0ZX`L@`^Ė|%!Gl9GؒE7nJSZ_%jG 3sաLGՆ8E _?P20Yr- 7F&JsQ*Kp EB`]୷AJ:PpC2X1pv aW@2alhkm7pWD]UN 3%hrrXn.hjrz.E'UBk^q,qui:I^b]c)H5G:_u:Ŭ|n*I6u) Kei=",:Bg VzZ*+ޮh["=1.h>FbXw9e RR̕ F( @ Aq#~Ht/"KSmM,$J:)x2} f-.Y<35Bg!h sD+c4 Ȏ> 3iJ4t8 NQP(cUx[?`ϟ^Y\F(-L-ˆw+[|0lc) -/m`墲f>⿿*6ſ6δL/Ep$w eؿf%w{J%e1q9;:1[qlgah6'<;‘kVf$+GaX*+K.ۇWMiBw}")yɀM|41^yp[ Y쏫}_:pn$MIqw\ۭ^uA{i)7}MZnp㞖W3#ٚ~_=ty:9z0qͯs? 
y4<9]̭tsf("Qv1o}^p#0Փq=nDw7tv.,oi!&OZ,Xvxy5ѳ>g7'֞f&zWΊWktH|~LcibF"~̏`x FQM2Ec OI޾)?޼}υ}^Of`u A0!|mjuM6EUjiJw.>4NbyyU}R^\"% )5N lQd8e]PR ?dϛZiuì(ʛf m,(=ta64|Ũ0{;}K3kMfr:d /J NN' ;tzv+@vf[yq@j{ƶ;;)kr.=tIfZj0I&NsfłbGJ:D"J6>I8eeC=Y"}f侧"t9;PF`ufO2M‡DEhQHp\tΌ7AMxÏZd }C! 7 <8m$/t-Ysnse %2 Xv\_&[D &!sxY"dK@dkPUsgs3FĮ1Bݱw=5WWDEH3-PZ QNf 5䑒MFdCiWDE6 fH)}u.榨0U y4o=]ʮWeׯ7~Iv01<1ɶ"GBZ<i \pl5^'}yD $5|(xs1_OPT˨5H@PR䲌%@-LFe6H.vЌ4+aɘ+#e{*u;jˋؼ',p! 5Srd&_=h4#7S#q]95H 5TY*9 UA3,Y-nd*UNeI'sv]to&~~ ߬A !!]J35*%ZS#t2.N{|POEBqiL}6pv[ƣa(h, 6$o$sĸzاa +Dx}ةhM-oRllmO o{oEt6gc 3 LY ^$͘ AIhbA5S10;рV0<;!})^.p]\}{ ^O^moFĈJϡ׻>w{ܩܓK@]1|6yї4o% /̊#bQ"ǹR>kRFcHbvsgoﯴԢdyeZ#> gߪ9e2ƟJњd8\s這ɢ7T.|gίtoF9ӼS[xPn>gտZm>+u&{%ƿ6ʶ7=V*VժZOVq*VEZ*VUq*VժZWjU\*VժZWj۪ZWjU\Uq*VS77H6ьHZժZWjU\Uq*V} -DU\Uq*VժZWjU\Uq*4( /U\Uq*VժZWjZWjt钌gHS5:`k0FӀq]Ä`@de=wX:l]tQ'EM `Pʞep%hifI6%Qh]L qXcK%и(@zz)U1 |O,B]5gC9g3ǃ5B&[m& P)8@E+QrM)#L)Evxk$t%[/08亄fvf򘥎AL@(K4/l,"h <"9r*`@=p'w[l Uuue`_sΚneR2s(d \'GPigu^cЖqb)(_tP?ԴâF!3[%Q j:OZ#< )eLunib@iýuޱki/QϾ}׎D^,Hf5'`6*EտڢG;I\񉦐trRxwᙦ{%Hpݯ+?ޜ~~v]m`\yc_a?~E.ԆiOo?si[w'kjF Y61 GE=Xz|=GkN,wr]>lѥՕf>Rn|1:i%4~^}MҾcR+TMkRY8 Od߿}~<|C.ۿ=|uF=pBƴ ;7;5@;4ʭxt٭˷^aNKO==c>״%/gwq5v5,,fbB58j9Q HVCF n~._Ě} GB,ap{sp 8E4q]GpSZd)xj-(guIgmEYV4Nhc>D#eqDL,4ZBdYE']P䲰 'VWv:Y٩,l♨mS.=<"9LÄr4nOi-گd-Dm %X*YK񀡑1sV4P*%t_{.eVwî~ٲ-mDF0Y^a5ZZS"1(sLZ64bŴ"S18P,F4ָ]akr6Al};:wmQyYއ"N27AxVzR39䣦4 F[g 5/NFk]>zxMl~4̛>4'4YS_gE7 ^?oQ!:ğç#LV"̭rG@hED3`8>H"'80kfSSaǑGk"nL ݲ-/b wYi0-H- x[]Y$l>ʼJFp;49ewҮB]x-Upe1fe7(xQd wcǤ4QJqԖrU4%!mvr6A$0+S,g\F"͔Quگ>1|!di> =M@ MCVqPe N ahIV}&@[ |p2l]p`;6]' ~(C@P rp. 
ל#9;|rvN2\$BR֐Ht,A@U=AU33 ]ʼnXsgu#6XU@v uUt#8Pf$ F-sǡΚ2d\ޥ|to߭IY|=B%ud(n; ^c|Z?E39`THxՀVs:2\JF9k<tZs@a؆l7hujw9x`HJӞxVgg_iX/VzBƈA, 钆,e}I\ԑ)jŅ,*嫄0nY.C3di1s(,%S9uSs׍&yVLPB:l'.GWj#wQǥA*OT-_>Z u`x!BQB0Xqƣg0+{͈eFn74#Y,eYV hSRkve,})2ػFrW}J1/ lp&sy,x}F)Z,-YdÇ*~UXU0<\-$J~wSDbZہ-6Q>mIz3k2;`rȆ 9%R hX:(Iޡ"$Q( ܪjx"Tkzyk8)} ԭHid&"99uB(AT˳paݥ0ADLВ8{a^I_n3MDAK)FZRx*4ӪŴ'>=:-(24gҞݥt~懗qts6K'ӳeF pLϒ7Y=Wh9-RS3Е|%B}0wz5{t^u3?Ctr ?/'sAf3Jw|3&oaП~F=s Mb^f_nO>޾^ی_zW+RJzG"+]|RH%Dܹr_GKco[˜LM}C Mö̞́I`ˍ] ^r޿^NinF;'E4,l%*"}W;+˼Yo{_ 1mڀ~tic?fF{vZz aK0&]ځAdrkˬп%_h^F|x'$?T| ݤ/lfEos5˛;[?]#o;tjJmVu|kð?X/z}5@Ⱥv G^?~G/nf9y u1>htSgMyuCU.k(qvA Ł#=wBmr;gN(Fr~J8\rYE?=z .ؐrav<ӹ\O.:P>l5<<=hL[d^tmvLL9B:_F<ٲS6Z̓(]ݓ{md=ܧ`,Y܁T6x^rѿtΥdU!d&A'<)͜u?~Ĉ?  0sU )\ir/⥈#>2RăL*$}^@ΛNXL6vN$ w'ƅ'o^qG)x&x&+;p-Efu痣i3|;O2_HADɚ|d}c!8Ck3IIksBC@id ʰ+/*{6 +Lk~}B1Fw/k]oLEH "y*ӄ DB`B$@ G]"ʖ@@#sDQ$PIk4/҉"Vb٬%P5qZ'br* {Tʼ+s*CEQyuJx7 Hd6\9ɝ&|:CgS2JR9E@)C}B2 Xg;؄Fe,6- W*la1ɨofd]ܲ` ֠هleH !ZeFYV WQrOq-a֕e'ǐ=1βRy 1b&&10L>&.l gvQXr1Oj]*lU>%,0c 2WD!1$,5tMa甧PjRYvD'JЋ<*5di/8/ Iѩp6֩,a<XL>vE"V$BJ *PQzʜL>itV@NhtDx$'!HVє\(S<(`)eTGOVQv gE|U;9bŤdWh ESbOI<"5ThD, 挼qik@03S..=,&;*%a߳L؆CQ^1*_1qFhɎ[*؏ Ssh1%J%"pQ4QDcPVcL{i U'AuL;Ahe$PQ D1'^LrHVD4NC2ux΅І28"*$xZĜSS232K)69wBF HP0m=F鏑P'}-ٕ-bTDIBCKrg hЭHi'y^gD OLyo95({D6OuF{krQJҡ+y6fsJJMd3BDN H4bxBWGL~4hTgiZU@L j)v$2,*xCN_T}&t84R9XV[\.it*Ř[TIiFvl8 d+ubVj5CX Je52EKLqiXL#qaC װ IՙI >x%>L$& Q1m.@H4$ g^aVWկ?]u\C&~s+pZׂxu5'-tX0dJeчM#5#AUF  5fݡ7QE[xo AҞ` ^Do'>6BH@)M.^޹&$*Z3+},QhL_/D^?Rg-ךROmZhwItOZ#…ȬӮ S>Od.^J+z!H%Q`K9[ms7+bf7H@qFR~F ^'Ɖ`1X|#HFII JVs0pEM4PFwPT/ƚ?>ko|u`ٔ}&F/"QdBR ~I,)jrĪ;*oُn6FSOT{bߵ1Lrzbx:kmo^ȿɵޫ_4B;=)+ʊc5:8@D㢶C8 ZDi:m#"D\9Gpz癧EyMJJQf ^} g3FzD=!~66l7vLi1KN'2s6=<|J/߸8VtT-ёxGD/0c"N}vE{;[? .|jF`<;lc MfPKM+~y4/Cb^8pӣifu䳎`OWkK-i牝^x|ckqJEM]#|aλq>möioygb3̠8̠ ̠ ̠ ̠ ̠ :ޗ8ޗ$̠TfPeUfPeUfPeUtԡ#* e5FI5FMQSmT5FMQ!> h,W26腏_",{獑25jx㣢N RRN8%;`Jj<;&S={Y 'u91912CtJ8A3Qd"KvkSO&|. 
-GΞN /]]a5KQϻ*`T PIpzI MeSŽ "q`MRba3bb YbʥlpXudbA&ʘb;Bˮ0%AeR @gL(]z]rzWt.z;M"x+C3&c8G}[aWN-T֊=:'Dn&꜠{u h"Oո;ﭩ2 4ik ־EXWn8Q s(7o\vrQ <"o_}E |bIN.|`-U5j0惙s>u,L?|2<ixF/'dZ V֊sYBG(K_N<Ӣɼys3ʏo+?{{:8t0dE?dZ/&oskܮ9DԦ?^jr/xGTj۝c`vtD&gd:|'"_~ë?C~~!~_^Afs2P//'W?~kġn-[ =Z 1OYn{nnHۄK4״E/&bۄYB ̐/b}|J/KoB`m~+pp/GM%͜TB Ι,`>'6DzꂕdΛS5A(=ai]^}>ȣ KmG!*$] f5":@}-EZXݸ3QE1 &/?;/>C"1yb&7|.ҰV(^"dњ.Qb]t2STmŏJ-^isg $5pYȆv:DY;.,4xԳ3uBJo[sߕTk@(Z%9YJRP  zUBaEY!༂@6)rj }pHf`=l8yy v\> Y\L+=@d)U'vCp ݈+#+P)#|S3*k>N?{ "4bڡP,,z,X Ρ%e#"@n ٬R`P+^(1l2uXt|:tg_tGdžKM/ H*jzACH!#$k^+NtN}N4U˪RUVxH.S,. t.Z:t&Y(F.jl8ײ$eTTS {" Y90) MUS䋊fBc {)ZmRB]Q3ǹZB>(Mlb̾H6 EՌU^P5V-+0FO$`GJvKG5F!\d,Dv _O9Xe8u,p.窊G9PPSHGp.2\ Vmf7G lPutHG n^lZ' ݁ `{k(߼?Mݻw5϶Xy@ڎ-Ƙ9W1B1{6/~x68t# 1eV*RyEtQ3XuP[t9f1$)J"FG fSv 66qvhJ>?:6_ލU nɖm~ M>Z.f>^mU"puׁ‡!YƘfqKH3jVtcBST< },I2uTRޙ[:RL-Z8oj^,{^‡w}u}5^..Ρ?]ul@hw׭_T!rmo+u'oytNg{:]_llJNZv i=]IY =絖jG˫g>D毦M-:"?yqzJ4lznMwStt$OjFn<+l|.|aVg>_X;Q9 x ,d#wpw8"Gc`H^sm˄MHjʅd-VW!NJÞh[ fM3tŷ6k8) 鰻4Ԁ!ϮVM\PtkY/5;Tlؗrѱ%2I )HIkGud10CX2Q{`3Qf5t_7⬯@} W^~qs)Tr;31KGpۦB.^ȼMS{>,F:wURLЩ*_઄4/)mFD f7n >2e2%#|`M"T^/zlHBb!8֪dtfس捵(rQ']muBRkg:k\1afvl}D2اٱ0>qnC{+O?<K0StmL~\O/ `T˖쒆ܨŸcYh:;:' 3L.[}t[[[ZYJ%[a\ptAGThq^cbU!,)\_[ huһ׫Fn g{7Xmů7OYW95 tFSYyΥxkքr){""J0H~P%`RsUjY@ZЇ7`E 8At$itT 1M~%^EcLpenڱ)V=^kglnK]!h;w9KU-#w}&jyTܵGEÅ7>ͧTJ43xWs'ߴ``ҖY/8:'t^'S|6!K4o3:&b]\}3lp7+Y?md*hoa?$J-.V-1265c{wS0V8WI(_ӷt=4`>kIVw6BM!b&IYb쪑P>IY>ݩ޵%|uƋ+MһHu.-vw;1Eu^usvN<0Q!u֔:vfWf˛noF7tEB~IwqMcS4]Z1rF䮫%m:>w#ri6 \ ]̝xo:;Rڂ{nRK< 蠍7 ? ͔}bHTLZzNՆdгG[v.,89V3՚mWkFeFm:HIr;ɜ8n(Y^x} (U:/Y<,J?(L[eói(0ő %6ɔ9"Zm 2n0S䘈2$c™e1zI]<-5_~l6^^s>,($SEXH̊95~^alr*2{P2{|O8sV~Il%wBYf$3lĆ &9%VP˝!OY4;"d>Z70 (QGm7!0#XP1g&m5gėUl¯.%?$dܱ9avm*;Z4pP_>s $A O^&'**" TӒ{`16. U42OP& f3I!#a:0- 9r՜ێ*b&ZmjjNN}O. 
K S ^GGLCVfqX+fy8E.Ã-xю 0$GQpXG.Е&wlYo~e XlET--boBz/Xxk3A8LTt`L0lT#nQeNޭFHHqbKFb3[XA`Irn mt8=| s6(0(E.F n5c1SQf@ٳUbҥ罌5 beP= frwsۤSM=Ht{9p )P(wp)BSϘ8P=t:FY^1Tk%i8ĝ˜T޶7GYf*[`,g*.\zQQ,:} kc{}:So딸Q(^ulKnu;/V /RdG3r~9R.J[ݧnz9J-4 ԛ\SOwTގܙ*ٟ ꗰH'kU~~`N+_,e=](9 Ԕ 0eNY,a8BÙm3MdveLU?r|d]GTRJm ;Lg&zG#6gǒk0WQ%0"m}M`veezƋ϶Ah _5' Wܠ^-?fXS 3-d Ĕ#"Sk.loENs.m*5 pA:* Ch)!E04 {LCM 6q<,d ΍5v7uǿL[y*2ŎZKK ( k+lF!fin4JʻAw o,w,y8|Xb7n÷]; ; ofͿ'ZPreKw2K+"x xi2%)$Ťwpa(@YJkLagqXxN@+eQ=,рhcp91ҪA:$ĶK:M%T6%-9wsZGc<>8".{tH)A{;M6{<"k*ߐ&O76Ë<(ڽ>v(9{WŰ.LKpΩTR=Z}}iG=J˂4Փ?zգM*u]V@s8?1е8H두-ݡ;ëMb{x'!gC -RW^G_ ‰ `*aR/1",D2""&ZH0<)cj `jW뮾4Ch+Kwo?~r}2sc e &  A}0A*1j 3ꄷs䉋Kih{\B]@x ŕt9>ctЪBĒI)KCt1 ?O]nپ=}hnٖn3Wv2ng=% ˜ZY5c `IɃq,X "Y,9v!z0s벗X0˻hF((TBrrAz&A4w0l4c'Ñ,9,S<8'G Gc4)+`PRwdRw,OL?o~ gݎǫ uN#9]At'Jќ$˒YJ)!(%^֓ ?[DdP 0M 1R1$ Dx4+C:wKC2Qc(,<: hGϝ-[s6W/RKFI:"z; ʺXYi7}6e L:GnG[ "DpJ^()lFqA*_uP2ρ<9a;"E RrQ ܛtMafXT,Vc  (B' L8W= UR5np+뒲n"D8 Ji\[ɑxc#NYPG"^Emc/55mH3Єf08Ք`/ ԦfQꘖT{EA+(+R^E*Y+*X颅fC@j!M[e,M 8ars|%5vxw0#;?T4qh5.d0%"ɇbRxp.50`Iocc %e/ c盛K ;͚KCi4ݍi"okECrX0ԉݕa@cD0 tt*avxuT_{科 pU\Ɛ=rmCI<>,_\#Q)r^)zj3BJ5bŴxZ^2կ;o&g׋fcsn 8M]X Q~\6>L5a^2j&o鼫], {PDQcS/|T f]_98\rdaLWk%Zg/Uk NLa#bh0LOj_H+/.Fpv,=T;*@&~>lPMonpq~zoߤ߿:ޞzuO޿}ߠ.Kw:@ܹ "vhϛWXYTժon5@QyU.q ^]p=8H $~h9@|P}DW,X cъz\WQu7&Dt66RtIltTD~~ gJ;7JmV4m8  R{zaf\F{f)]I_R*T퍘fV5n4>vļnm] cZUT0S$6NAT21ZDb\z, 9יUΰ=w>y<n6>ZBw^'tG3K(̒dPN(\ѻ)Wg&R䣛Iq0GGsK8d\tv[VZDl9\脫ːKUR2{)f\\cfcfcY%s4:tNdd Wrڮ]@;' 9SSI7?|%AQ\sН LYeVSc(ٰTBErIx/#l}1 7vMy­6Q.4*e 9)Y2-)vՍ\mT2 4NٸP 4DߕbP'vlry<{6^M<]vbz\KIFd8E8Y_8QHkE}~O ftA8 D'94\Bj7Aru'=p,XC$]0g^6 x@lԂRR.caٰ=w!f%j {J.;S#xdclK@n:,)iSsnRA6H8Ysa[X6ji`ֻOBkѡuU_t ##$%"*L\&.1N]=>6C!BH%uM>S9=';~?($%{'.gw[bTQs^dlF2vY {Y{ciںM9{_hg;mb\~=ټϟ ^Y>g4ٳ96Χ&Qk9^7bS ]-`Eq%)@P5V&WDUytkkB]1S̶,VlW94 K,G"fF-[<x?:ޖSvDj޽!yB{}ola-&mɻ fW{%7EOMUX{c6e29i֠7ԙTLYXTb 4g1Yr#n .RfΝ!z&E]ںKxHퟺ1eLobѓ=mG^jĤc47޴<ޭOm뽙iܻ.wsu?~χU;V'bbIӶM;}?Az{Vn =g c^×B__kN27w 'Rt1g?]mHÈ%;!2%ΣgdrY(Hٽ(;|(;,({l#I8' 1qz7妄p.Ig[.@v)=#&,ؑɿϔ҂a4!ņ{j (}˵WX0X1UTpmylt|E A%qE{ۿSա+THwQK^2e[/ƻTXZ/l[:$g*k>PRuZT]\%5J 
6STWJ [pC{i]]E#j8X#E#.qoH$ b]!S,ȶV.X\X6h ʍGgNu=%5`X2i@r,et-ayZ#~PUgՋȥ9:c"֋E/vo5)T,kS$kr#H!l@^|8}8lwՇ>|60MRaW~D0YMe5?] ꧜mi>C6t (P%;qىaFىabF'1*doqCV ؃29S|OI3y||*S!>S#а*}Slc, r&zD&>ut{WC)G',t'og=5=/ ۿW!,5/:u[/{q|, {7ܼ5 W cRZ z~D)VrMb> TBAt?X*}9^ow{Ynw;-zC:<cȱ$O!D bb[]fA*ӭ^[";HN qsYG9n_'y&-zI?ݗuJuEmԍw>ӿ}hwb_=Vm׎{v~w wý_J]nFL8pM7ϗоk$ct-RwpvqlZP\T>^ׁl~"m`wQaBJo%` [<34(#:(Jl܊3.&` 1ʑpS-u\B?J .*?TBܰ6g()$vYIuM(&Sjӣ~8l9&U_Tr\\~^o{suӕ{BK"ҟWcDzIGq-b,B5(/Jj8fjLd*I Elč:r̉Ceks:L/)&nbjDTZ1#-i ~grK]YnYZ <)*y,_YbxC[JyzqEh=3}.r³0"n|a]) i5+ ;(Z1|3S&E&b5eFCd{K`ウǬt %J"xP%8cHpC-4'udƧ̋jԺe4\o-COǍ}^ rNn,)&dL  TL)ML/8ˮV_sqӌ4 :!HRPor_z9舛3%&5^Le0z}qnu:ޭ~tZWryZ؊nGpO7%coZt}β᪇񖅻wUiॸ]|Ogxj+ d P(V-e,ӞG8=#Ot:^>o7 k5ptv@O+HrPch,h4'<7qv͸o?7Mˑ|eNlxp -\.dMt&HC [E#|Ű(WP {/  EҫIt`I)JIy E<oNvZ|p@}]w舧U^Q|owŠDɕ1\K bJP[TS!jcQ,gY4ϯyWpPZE'F9)AUm\Myv#5tok[4ϨOg}߿Ps& WAw+k7/cjWǜY5{S}ŀZ`{b}]~w/BCXC2{(:zjQ?F3]koGv+}J]#,lM#d8b3j9lZt-$gzoW:ܪ{ aL|i<^t+ruZ޴Os NjQtHtTUd՜R$v1].u㇜|0[?]&'h=|~vL$N}DE2"P I#pSmN-Fz}3Nhqvx#d3 uMT,j8-s)8kVSG׊R<竏J~zUB3-Ͱ+iC'Ӳ+ټ) |O8ۂ:,JS$uVb\HW,GXa9|.8s5AxF^cէObuxGutB}ت//3ϯϾ?]Iͧm:c,/߷%=i[֣ ܞ7yiKkqƘbX:=y[e܅_Ϟ< -?o5r8sg'G;_oS ۓBk/Vo׸KfpDd9* iu;}~i~5:0apNx1 l]=oz'ɥAZFAԦ\MN;ԝ,9/jjiYT-*aguv.Z4)bڗƥ^2?gsѳj廜M{x/Kjk*JzO=ۼgO_jnDOC)2h?rz>`k|쁜 l} ?_vd?Gx?>`Oyl6ԍWyb(/l3즶ƽ Lm :f%nᔎi&qgfjO! kmMUxPII\#0{L.\:T{}2ᙝ-ݯDh.2b:va@%igǩ,HUƓXӡ!%cNօM,iAO/Ƨ\;nN iTE8USI{k%pN4dEoc9\2'%d >}oaN AuL+j+1v\ 6! a #MABX' @$rNc8bGܬ ̈́#Bl co0f|)2)8ؙ7O@Ŵ_ `I. 
pq BА{ T4gBQ".0&EVWf֖5:R6No&#!J!r{&j sʺ@J @/[.QOψ`#/z ($dAIh8HZ(  ^ B1dȏE5p4Mdgm/FkᗢH3fD' GcDH /0|nm/lTSU<[Vе#[ːw ܦHB~ d9@X8pi76Аf%SdJW=$ @+XȇTxmBV58 E8myq{'*`tx (nGdp6 یEVgeS Q_բ*(6d' O6VE}{pڋcs6q'K =`Ye>bM=Afޣ.O9g8ge6 /!B.10 zv50I-=F AnK)!5 Y$qG:q r0]Zx _3H[vVhE޲N+}" A;Rͱk?m,  35)JL GT ݑ8dDYUõ-"PaVB]$1dlD!HƆtĪM`k>ZlOtXI3,I#k4ZI@2xnJMKKKoU4^"ƂZF7k@ QmyZR0 ZUKvO.Oڹ;-Q&]lע&랷kos SRId(`OF kǯ@H9+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@HaROG s@J X)=+>G%~e%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J VU>>%̍(t~J X)5+>G% @b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JX dE))и<%̕(`2VjJQ dr@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JG tQcoiW~./y}WM53Oؚ/jWs"A~]:%(yT1ymRI.h0ϣ~/̆ 1{e,:6w',h -ɞBȔ9Z/"Z0mRES ,?i9e.g tWGй(`sFh5$kdP'I#ԟ}x:N ^W"m;m2PMJ4x*ϗ `љ>sw]uSDMa:^?%ӕ;:/ yuUWێ%Ņ?w2q<{D.7{nw; phiOkbh͆0MfE _}59n2uߖoשak-kFE-MnP7\oڧlyf ۻ#( q8,Uث <7<` R l#\*#fnF w)A')pvK3 YAM y]N§Dt%u#~pMP͞¦~0m%I-_oxYbwqSxSRtio~ن'ɷ7/T*ILb+R5#"Ā*XA-wڇt[<6*sgٺ[2*da6he! =JcO2ܥdw_;z|nWǣeLV\sLL)掫d%-<@8d!S9Orug7vi]Wx; x[(9\J!^ F@>Z</xvyJ(tn@ܶʭ{RcaQ;Ljm矍2{WLwc&y&s<VQRʪ>1qV{tHS6 NVF߈WpŤdԛu>,JЩʟ4{eg {Y}&}ߴӵ{? 7F;دp'jdgӰYˣi66I+Ϝ%i&9I6t3G+X/HqNKpJ* 8nmwbw=Ü?ڸ{T4 X]^>K2`JS3ǭ@P($yZZxEA:*o¥*=\T򠕖N~< 코#xFKcT=ƥ >_ogv0>֝z`Qo5Gf2K2*GQ1 22A҃j&x&a0vE*gRy"'dž8 d%3rp8džÂŽ_A6:_Yh5g, DaRvB#dz0/aX9V$7i1SXd%L)'K*Qj}:U&@bA[ :(Nm*'q*V9XpeA0P"xPf9&h38gKkZ88G#xwp|߿m6$ξ6o-y&C1>DDKi^h3%^ l}uSF6Ĵ1HKEPSf+1+2˫0eRSwI%v)t#-B4,Hƒ"hRDI5ߚ_Y]oV|VY:fߊnVy)x>/gY/ 3PiU|:kL td0LXxSƀ>$% XZY4ظ܊^f9MF hר0s%LM)V8,+7nYʘыfhhT~zK>RkF/AONq abb(`WVO6kjtF=p5 E?~as_/WT4s,ׂƠ׎0Ti PޒCRܶz9jS&e'ld'P¤Z8.$HReVlL&wqHf^.!1h55Gpxi BygL1_pJ#V̥>)! 6@IᢐXK=v%xoEOi%kyӏ4Vg&TIm9k|p{ Ld8oPZ>r.g Bӷl`9[  "KADoAsaŲÒMJD-BG>8%\EAUL݈=&E!qkȃA&.;5s05. TG Z󔧂*l(k)~ Q3X4UFfZ¼5}%UpTA rپia; ]. ߶3% ԼX̥q~7* E٠ 'ARf}~ .(YJk iuaI:MED3"i[cRYtC ^c.*5t1v=I)[wO^/JpE`VV]<5p ʍVfUvV4Q2t_0s SlI^rEbHKΐS|҄ނS@Rs-MNi)~qLLęrW[ r<}jrKc6tz$}>1Kcr,tVI<~$޴ӵ{? 
B:xzByM\cLX JEh%VE8਍4ªjQVvdׂ5~/ t-o-z.Qّäjf&zZޕqdB蟙R.±u07g0"P5E2"eGwW}EJٲZv'q,Kuj;4Gkbl|3='J8NzAh8HKU6 !57:E {V6x$yq zB*0{Àh{L#AM6khU)SJr[E)ogS.\s="ɖj.M϶fKw/?8N2s5P;i @Hm1Zk=pf}a)Lejg|-Q|-W,=dZi,') y^G@ZTqE>H<M:{ܓ;ݤ5+MtxvEngΦYهmvf]^[ Fד]i.Dmǖ,*„ pT[Xoe= +Kr+kϹ|VL`$,/\N2 Rb;…1 UG\햳H ƃjǹ2D&B] 2km:;ZY iW'y&CP[qܞ禇xg|_M-r3*Ed`*2 A2)جEV RDfs I}e&Q+_~t5C@!y )R'F&Dzi 6(!V*h"sy΍NLXy*Y+uYwJRH$坅qJ&20-h VK4^jBѢ cOOBs]I P$Hr8-x.J\WQGhd8`7A3g.#s捲9IN[5JN"QH\ |a@3&ߒ. 5[_Z4;D+Y4?tஆP]10~[)Wwɧ2GiL2$5w?Qna6>4=un_54O;֊I53Lm'`-X0ry@c()9>W(v6nt}<8#SϚ$RVQ Fb!b7d gJsf~\KϖE[`jKARjӓFb-)Z|TGcS+ʦ˥]߯PWG/ϖPP nn=5X^SR=yS>xy5_jݞ #Td.q W{roW.H?]KwWx8~lI#<Ŷaa:*xRʨIU1kxtll9b8cqlLGeQEnuZ%%-eoHX |?\Hr&rw%rѩDgp\==_㻳^~]go^~2}Wg޼D܁y9|gQOᗃ&5ikև@N+. _蚒8m/ps]S |Xjd5J]ʊ= |Ŷ YP-EM'̓2jL/B)B4b@K#?;Hn'b+hNX>zi)G,a]ZN$ˤh <}?xqHLHkm:;Yna7w*>?*+ĵ{ˣ6 yE8BHFdRmxm/kA~bWYG΂i BQgdbRyB3b'RxRZZ~>NZ62I)hw(Z:Nq$C}RLf*X#AEѩel;XEd20őVc(x!j&8j;H0 Y4Ps[{7"8fAϺ|Ⱦ'Zfa *5Ŕm]ĕhf?-5x} 2{,X'pGk/wȖC~WXxsGؤ(u/qHMلIԤ'fSl%<8o H 1IM%Ev%9%wلLՔyMΔy>ڤ$:b=*-$ڽ4$`OWIo,Ce9>PBZA=n#`A -֦s?Nm|_y):S$<]4P]d`V w΃H4JdJxY,>oL7yLGGj Z&E51HŮ fwnq<̞ݝ^ߍ130+!d֜^o>ߡC bmwv=Nj"OKײόk5{O׵޽t^ե6NΩͲ}P^ S]^?1NtRLMQ`[ϭvgthsםYvؔUe7 ;8vKU Ζ{X)[6t\,UxŊkE]@03**<_pB`5xY.')/$LLip,JBd𠥋PodR"`"\^ztӕJhq> .[?OJ:+.&) Kȃ7M/nw[Huy<3\ΖTWW5y8tACҨZ9-2kztnvsr{%2AD2Krev*&VӨA1~;%Þgv+Ձ^| red5x!sGEYnDFCO6w?N*P-_(Gr|U;Y}kβUyVT [? 
!T/d G_UsJ*W?!\^4ako'?_o\]ægߟx}lA>l9lbnk- /Puͼ,B^ 29:B]Q39MD>sіߝߺm+^mb[9w}?{K{)-*X2B0SinS}34QB^%-nUl }M#(gӫ/Մ\,O.'MJr#$}M8߇Go[Fg;9˯n7ko BÀYo2{5p)G-}}g =V3ڔ궠RmJ#ErwoMǶln *L{Y<|l&!k2jtUF QD@'-L̬ȫ5MI;EuE(f"##OʓUq5ZNdl4?d60`gNw.GE QqW$> IJ2 - 1fleW2WղlBw 9B#5g}VUiRʖrfgk,VF*FvLhsPpVyǙpLirµNz/]9-NO/TCT)G!qNVU̵`RXu*ƪ-pL`7Z0 Yu21ZsS\%Xq36[>{ D4G0+ݽj ?ʇMfκh2T`#<%} ڸ6g1EOڤl(9ho*qJʈR(j́s 3zU1Ɠ&u}-wSa\Wx7,#vE)1/|-F 萚X{)b7bgbҒys]A %ŔdY)G DDZnҰkMBՊJΘ Du3V[`sp[[bD\|1#9IpV1@TRXp0cy7A3lDpvܶsQo:nSja$5 /d@X8piUڃOf%Yv*ZPL.HZʭp1%3Hv9-/5(#@H1DN W$2V**26VYѴ8{  E@YR:[x(nEf06 ࣛEUg'eSŴM DU;͊8-&sJ uYI A;ٮao Ʒ's6piA/&sP6Db6y *:@`QV]Q׀R"de`7hOi P*(S`qnj@p L 1A\;djy=)g @)kFV-5*8t(I B9viEIAb5E!ܤ4̹+ cD8gQʐBPDYl9i?cy!Գpgi&0QI RhěWPrV-܋U]7XPK&-w (afsE?ON%TaҹdRac-A3zyk:\WP' W`!0Fʟtل(V2Zة 5RŢ4V5z8YUX~ aQV {V19EKkLa@jXf=_85'EiLA I *tD$Ơ~\,y~9(Ws΋6`jh/t/5s%!6 6PX?˙P^dzΠAh(AB)Hk|T'=&3JOQA,2 \ѓJnpQi0?li3ĭ7VŪR盉`ebBvf%dG%| I0 ](I)`$di`M(uiU~@ޮHD;@3^m!7ZX ";8۵k+m1E߾}IΗ 1t.YJ %q7[k <)Qm|;{U\;=y%P{ʞJ@RiRD%qL9R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)^rsR5cG s?FS)`O^ + "@bOJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%U9pgu&kG s~6J 'NV AJ\AJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%UijR`˟lg+`%zJ 7@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJxi֥V|7;x;oKM[ewqzATJ^r~q\lj@4ڤ輤߯>3Uwm@P]0EnfOjjg?<?>oYa.}6;[deMe){5 2??b'MN[eCޝi2[1=ws0cmNVMˏ2?m϶e>WI)c7vMjNB)u Θ6Ȯ:;QYx)K *U yіU+ߠs8}WUx}Ύ{ Ӯ~ /OЅn{P`&47} 1=]ʽ13A9{'(\h=a#Fх0?#{,p qsn$N;47N#’}D}~:99wc2X7})w:l̃aӔD\3wE}|4}\.WI\dOЫMvta҆;ܫV~mwpTaUow{GĶo>Ut|>ۿ{9WPOb۹1bn9i\xnZ1hN677OIOk{K M>^Gnι5ݼF3W_Y{1%r^Ft*(sʜBNe4u[{:n-X#רɂ>P̽L'l(+Mw[sRXʶ1 &|w~7l ^=3'GN gN;':n-PI[c}tK2$7LjF/Ȟ*;u:9bS/.^~֝Sw>E)j$PWCzE-Z:rk]K$ch~ڽ?Oqq]W 㼕,v\؜_֧F|n &IL=@?Zi6.r>G޻`/.n _W֮)U ?lsX﷭е6jw{燰p~/#+D݇Q2r%HXn 6~3|I5G#q5UݿztUmqYmѸ/1ka9o𻺵N-c?6/|MrA.QjiK~V3Im4슎~R5f-uPR2FAx#._\M:MvذtY%1j`[zu|{\#IQ}~aُpj#kfXDwRu@$ L%0Y_a[ɦ^Vp"$ 2:%P &XrVg`r#en5bُnH}KppbM\z8Dt0OpB>$thII).,~>ߏnٵ`Wmx)MlLǛUVaݞSKq5kE_ۘX)KS7tL&[WigJmcMfeÏOShnp 56MrwZ#K߬?O߀]?;|֮PwP'O ^5=X|0| RH U4X=Xj\ 
I _`G.>x $ l7Zm_'k(57DЇrO Ŷa^N Q( G *YOPJ+V/65]dxȨ4wGg%hMN>1 eNP V90>c d]6vw=f,VQNqm 6=ۈ kZA/w|gw?Mfo2۳8 *IT)g~{Eu~縫xV':ci+0T^Z*Jv}о|%`=!'tG>SҮzvES-QPGjAjO{8$$4||'T''&*D&`@bn$2M l>d<1HE1ܧIH.ޢ*阄r:$4st,B 3rԲV_Jff3iQ8sAuqKkneYnCg!w?I&'UB⩄oäH0Q[%$l(DNٱ̔RćyQLo:}PMIJv_%˻,u $Q2Qԛam9xܐD$OTE 7A{ACf(bwiFXl2R)1j)eʡQ, G I[,je4ʔ </xnA AhtRh=jO.Ze H 1giz,ʐȀ : 6q (LԄRIp~=Pw5M'Gx3 x3xb (ig\)D;?>|(0/}H.k5wR磬yNmIoFwB؋g^v 9f3.8dg V\3? |>_&xq&Z<[j<gXBӨL`Ʀ:Ŋ&d.-\M߬fo3S\$3?|HGHD3ow _:uݰjN7דy{+rW_*zCg{:BZGK(Ք<3oFiz4/56S<~7}b㚯𿍧y2~[sa !Vxs]~匿ɝ韁[G}aD0δ,? 4{Ԭ)LSNty9kzwTv|ɶQ[UsyfJK)+bή́7sڿX͚p{Xj6Q_'v/wwߟ}~ ~ҩ'&| Gӣ_xMN.CK wNo]+x:kjǛ߾ƿd5^|X"#? QNl$GaEۄlPł /d˅hwǵҮ:\4,a@N,zp8@G{!]O'M W'PLX>VkdՉ|*p,{%'=aTGCσwX"%1;(քa&cxҁ. 8 C `NJįӫxy}b7= @.QXpt: AŠT3<@ΠV zҠ':ר9OņbM%=J2Ђ  0ѫȐ j<|8:8zk'_c;D+tF!dTܴBIa2Fv25q| mYDR*B"|ȱ§:U~pws+D 1CqKl-wB: ѝxXf"qȱҡ;5Ao` ~rIU7MHe@s4;,I3_t'Y@]4rF G (/",{r$-_G(I<(Ʋ,\ҖM0Tcn%H,P$DKM#.7tlfG0JYEjwQd 5z̀4Q}6jnAw3kZEYb^BfJ M,9ә>bbdiى餌xtIr5Sl¬C,yC:Z{: Eʶpْڥ둌 _tKwnA[ۍA94@zkVM]_}*ɃypU@UX!aX8Y² e|6N*sޟGrWv aLRBR)=AP&e$jC~w*B[ֺ:5LP7rkU"O#WKNgN YE0HYk#,sǦ/ro? Wh-j6Fq>s@³ӭn_%]06i|ٛ f'«\0m;ǫ7z|ޅ{:Al  3P, = jqH#9*1n5%19VpAIM/jR)ie-w)E.18@"ɼP3fB]~v S;#@+h7 뵛+Xϻ1ff5d͗pUM_ZY>ugkVzL's(;I6 W]^?AC.UuE/i:=Oss6eE-yn~v|[\!gPCQ75A7?n.8Hdh&iyVoQԻB@]ui Jr * %0@Q3֋+k|>i0 +e:OmFEaz̷⋁n {wd3Y+xWlvj;eOS[/-9( UE"uNCnF8"-ӫo<111&>Hɺ%;&I2B36@r椵 F.] 
D0-8*H)C R"SdF4I^#-Y4HC9v7,:"%aŎC6x\;ORyDo!m6'ύ*GVHlyW &Ŏ\{5!d\V\gRJV>W5oCs=ybF94-6⃫:RGDM356-GhVY ȬnaV%֘9FYNeR*ɓ^Ȓ,/hVfwikZ=d~MoEٿnoV~+q~,,ZT\Wxf\xĝKpEK4#*ܨLQ̡%ap؇HH&*JAH}Uf!a]]%!]tq|z|+k6Mz\WM|5'bEw4?b,l .~88p̶QQYD,vJ#I}zN{Mp;zr>{JHrDϹ&2'P i襈KCÅ ¨C/o7!y_ix/OlVw|{nZ?qOZDA W.9D)E g}ɡܣK4O0"^n0 "hJu3!BV }`k'ֆW7 Fx;i<ʔ^-Ǔ->s*l_xŐ*e22s"AKX)(%#%43F9; =#|m#]RĜDF:!]G1yg\9)IA-̥!4yjx8+NΉWDTրK=(u_\p1X!,Ii%rNٮ"+H>J}5`I }+En?jB6uIfw6wpsNVK5@p-Bs+Fb(E.eV4c-W}k1 J\tt:=a/iJ K<XsQ1B-#KKۤ¥e$huh?oI'4ͭ,`ᶗ"QG&4]V^ƛr#N~*^>S~/o 7|9r39nOu&of: Cr&|6oO{[|!u͊RQ.vp[6^b]Mb]r[.c7w} 1~ig Jd_/ʄ]Ύ&r|Nܘiay[J+B?+8],kg1(D{ɿ!U6Mwؗiyӓ']dVC/JGec`y( .y<ýcƚYԐ@\`1sEI3AяW im.D^jkymm^;ax9+c1ٞ4FGPtavje?kUeŔE3:¤u M/l i될)dc&-yUè* ^::qda$/5#.D& (L]-,ftSl}򷡙f͜b "1XƁi$ SI$ɲ/7 jR9] d]L߁:1@Bbi ^t 0+ jlt}ҚhO;b{p檨B Dmt)bRUe0<ĄE73H[sbIJГ fTL7 5sgX;WVƈ03b{zRpeƟ~vypPe= IklIܳ 'τuOYYk(ʞGdBRU dCIce\RN2֮3R;;OqM jW[fD>Sɲ8nE+9.G۴9Xs$AO:#ЍCZtGZ$a&HERCڅ.ug3JˢPmeD#"QA1x|ґ`lTFfA \YΘ6tJrLJ:ind1&-pE2"VfD$T|HjdW\tqэ8c+KI$.%}#FR{Xhl qb ̈{šaձ# V=a^*_sm֍bY}Rޏaz"kpW}E7*$hBI,CPcT.\ȅ dVt@\dI&8#YC"pRXݾJu{,;:B)QS# MK3cq/:iYt8#ŬH1aK oY)mn(2ʟ' ?pf6NB?/vSZ1旉/f5S:$Ӈis0}V]VmZ so)uSc!({}9&)<qh>yY_cjZE\O%G<(2?"46odq=͛lFuJ*%_ .9UɬBêZ ]k&,JGݕۅ*7oN떾sq9.{D'L 6jkin ?o'!9}`y>ſ;jp&[ET$榹TT5r a:w›Ivh,hd8 7PLI0;l7 9r$r"sMhҡ5`},;3~E;Y2.FJܔl>cJ3ksz-\I\-e#z瓖)9hI+Vs;A'MN q~lZG Fd,Y^D|;*L d>f&3>;?.}A4,JNA8y9#a__ݾM[z39]v:iEWO_&-&fq',o?~ptͰ,QE(EYB8(23-Ȕ.!cQ.S[O˟sx+_!(qʙw؍FXt'vxwhmuO;+nxy)DQ dž(+QYaߘ˚Y~/#BSQQGl0vKKYNn3 >a\D[R I=1&K(.K9Ru , [ u3EzYdFt$qҸ gSBf~Jpej"墤ZVb=MZWzri:RPcfG_ Lww*]%B@"8^1m6BH"ܐy$ DC n.i+ȧą`(Y#.@)!iL@81Q;[fӻOA$)[:`b0< )fPY>Lqe,.f$&=|m BꪐjŒʠcb NnѦ d>'ՓޒYQҁ{[N` ;U0bw!ɘM!H1~TPqD* Z P`7zሳ%O@MJel@26OK7Uz;>;bF ☤ŔJH(zQQ&@T&X%Ut9mKnLT-9p <#nΜH2wW!2wmmHyp:!Y[3y[,ff7cdquus0@lYfYůU Frו+vx9˗sw ,%n~2,ތ0|-BoOsM;A2y囁WσW5c#dk~t=]ŃMQ z643g(YD+L0B`*^Lŋ&v1i-2ptK}$Z k1'FpPrT>. 
[Binary content not shown: this file is a tar archive (`var/home/core/zuul-output/`) containing the gzip-compressed log `logs/kubelet.log.gz`. The compressed payload is not human-readable text; extract the archive and decompress the `.gz` file to view the kubelet log.]
!༈%QeÝFqKf@-w~ZkhZQBTX9F1Cy*iޥBZ,42=bDO3*1`IEaE:45(.8J3ɬu !8SNt|!6`)e Jgt :VkJؔ{%##| ʫl-S:Vq]eGeUu(*,s*q=zbRIɊ+C"!TrLDalT &&)bSF/SAe-̂m1Ie}Z+Eq`U Y !#'GWA' UXJOY.!:LIKWύ1gI\":+4JΞybM =1X3YMr`!Ln(I js4kFnŴZQ:ҁaeq)#y:)X%Lmd9rNLe)8SƦ`S"#RU%g(颻F`m%[x.htJhL4 )#'2x(.Q]!Zhx \Ӌ0 &QGWB}OR_xI|6QHpkܖ}GߓGqPd$| ܙQSNT,LI[AQ/ylT!D4$0u FkG 00J>``UF{'WqAHMfD!(ZNrZ-ɮH:%r.;4+ڣ`Q$k< -8A27r姪z`tQCc2 !l2h_ESd}E#Z(QJ')Rn1P,ɱ$HqVȢPIad"I.§E=\C{̘pgFNח'gn̅,^/;jk˧'zd'dϟo(}ëO,$PD|qխRN/ḫD=q(O{^n~]\߹Xsz"[m0c)7ۓ/ĕ:&"ʯ[ڔ*<b:b&`% y+auķ+I6xC@P>LBoX걭Mťd=hzm"pi y>;(hs9JZ&92 3[(?%iL[Cd"G^xA/KB*p@Z#dSRg (K!$x>WQ(KE#(⠄/:'ٸļFEuSb䅸-PM PZQ: Ir_: s4cɐSꬃeZ+흕|QU.ЭQF#- IJ`b2II!2.dD)wRc9=U଩2ML)N%}ԒM Y) )Us0 R#J+ !{) AQ22K>[&o5u73C Hw& i7Q!D9@vI* \)aD~E£Zh ڜh;tDm):&֢H֌`E+MceKRc[RڐJ2H%4VkR9GcN )_El[ ܷ(mѵ3+<XYI+W ADEj}Ax%,Pdx)-#3w۶6r) %w-~ y4$PZ~\Q^Yl[P RU@$On yF(\}'B!S0޶mE5y;vP[%HRͤWVmz =MI'S !=ESZucʶ.&\s彡wۖ6Vp!Z͓3G8c6C:@3}@jw$P3ۀG h.lKPFA97n4&AGEc egX 6t >~G݊~wb.e5Oq+=wA %G²V; lv1[|_#nWM(aaY)澘>gA Qu^񓺬=> l 4r|}s3]34&[fXbqwHVEGEل":$q 쀲{T.AkXL`UrQryL^FIm^PBӽS; ZiNs8K'.),ÆhVhN}ȧ+iw}a;-#~G:JQc - a+[miP1Љ W4=>/=iYiEv: f,pٴx~ot[6Ƚ\x}|(I|fP@(f:>|Mp9\YTvo.lƍ n2DB<|2nrOǿ&7{uvGzfwLaa;-5q8/'!_ywǓg7&)\BhUu`o/vR-uˮ%EDYER0.jI*'r,վ)]Ƴ"}=z&uDK Ar9K]|,۴&MaqkK1M^Lbcq&.by˝}~?zw`;=/NjtUDjxF8a[NbEZS0W߹ߧs/>ܱ-s7w>\Lڍ?8DaJGI e#v< A }Gv;`t`Q17v^#?8Daqc8 Kbd:n"Ww+?8DV^>G3~s xr%v辮]#Ǭl3EjPR1]8=9X[oV*'o~RޮڟU_ZʨW k}+і݊mi頎ugtšla)*^,^=?ƞ7f$1GuI_U"'uO9լQM4\^,B֋neJ<s2Ƭ|ro\ {yO>w9V| 6jCMBI9I?y-2Isy&پ -&i,$w;qA08+He-3LC]F]-6;ٺ f(tYDBHr*rG8~yqr _bjG₆; H&"O<-4W0h+u96 7޾~;ǎ8VťxQ<.J:[M痷|?_Df>JPLyzq5.)|xƫrV%uʜ`L#^7S*j̡tiC-~HS'A-s:F)|衏Z W{ /?oWra <}Ve2~6d\O\}54TOe<(zbGv'w5L*pg?[YQgwoxi# w: `规\[uf_9L[CdȘ69 Gb:%4&@B\* J-x4%a6L4…tH-FѻƘD{F |9f!`u$ $zzz%Q&=l4HgX-#t]U]]%ttrUfI$A2n|,x7|"%S57u[CB7Pa#i$k4J"AJgP@Brm/SOlr QHfMLH1&wcNhོvQfHs&a}Ns?R3E>~̲4pDžl}1@b4aZu&uF Kc&ENl#h<4v MDt-| o/Di[WpECIzdR618PX:F'Zw_iftGĴy.5Jf-@94K!Dw!4No8}`F?ůFakњ n?[(C@.FMRהnYn̖R7 k1{n̩JGBildﵖ;ۜD×,&W$hdSMgz0u-ޡ+R)׾8v\M|!r‹/XgGĨ )$b^=WNa2\pL W/JG9))L!o9)Rȇ<|ǔ0KO˹95]A@߃r-"KR\n8@ t Cg#Mw 
lЁTS  WM ^!Zۣ{Xg󉏾t [H468w8#@]t*'ÎGYDb%8IE"4)Qf7))JY㎏|C1"'x3H!VjؚUǃ/xji=}2DukƤo=2&0 ,:zzS9S1aFݶ28$%{1?"xjv jnQZb?ó!wN<~)8 DPȁ4nӃBnXf,T(qHmapJDamNB~YwҐ>kOw4o5eSK蓝Ywy}頇~S >s)hY&-s36:`!d:Թ v>e%%:,N p bK5U, "0rD8:n`?gjr &Woy_OrCx՝f_N^ob$M󆐕Hi* TqŕZ@@R)\GIPcҁoQy!'+#Ir%*D*imda MXЩئHi]  qY:́NjDmɍv9E%yilN(y A“.S[׳*8^ad/jT|hbtMaNU;T7Aȿ5n!^h%Ka1Qz[ZAʭfD0XaJ0'rs*5@PQI-t4#h콝io?!ʹosm?/Y󓫾++j~zRܮR9.Noݫ_ \oțwߞ~ (n^DDh]M~|u2)r͝o///O[UL:,LwῙ 5<|ۚ;\3$埉/@1-*];W$F UĆ@DҔBwa]#^D#M% j!m*^dzz0%`ts\D:/eU S[S/[}p΂LnhPasUQ8AU."D?]Pea[h tj+ vq|5+KAei6Ühr~!`7}Ao#LCȱȭ8cZ3mM`m>YO63-(KycBs{HԙgkP2\`&ov00y&Z*XIǩr9(eQ+DXɌay!)Oh 5ZI6I<C-FCl#DChj~!OȎ>^1b8#؟P )8r9B=L DXQ M(+yQ>,Fc 6!: nxw5{NmCM}vP;!ȹFI D}/, [N !lKB"fq5S-X|ݿW(V 6p ÎX"a@s̱Fq&B#+V:/"BֲܣنZ2@4Z9"^oU1[tIYݧ;7+`i8_7UjbUꯇ=6Oζ#^>>]̿EDx*zoUq?]M~ jVS_ͼp+W$+icݩ[7]hbi":k4nz_J!ܟuF4Ժ\DdV?VuSZXNMېQfݢ n)$+2E}h϶ޚo 8Il0LBc* 9WU@ZY ȚV!DyK*{#{M8Ƀ${crY꣩"Q+kfëQ*_AWڄ[zwBk.][s1B 1qj{E0GWO\1FfQ#KkbW Eo4mDUr^cNCIrwJLowFA8]YrR]Ȯ ֋29Jj4(ׇ*pMZu}(S7 c%(QPް!Ū^}fcX;rW^2>_L^}[bzq}Uy#A6D"['p}e-fiLJޑliu;g4)e3 ;XR4eJMK$BOڣ%j<+AMW -Gc' `'&`Kظ-VZxQsw0v#j(H{ΆЊ(b:W(g; 1aCu:loUӒh$?I!u<&@N=ZKY/Y(/]>P7yn ַ]x,VUS9}1  3?֖KDKhmK<a캜^|knI]}0Q5/<ՖX;e|guſ|Z~zu"Urf]lR8D\.]Ɛ`,-M0 x {x]ck!/9+ PAIA;z_4̍G N(tNnN`:ת#-acءQ]|O7 FrĄ>䚊ʑ;GmO:)8 $;{Ԇe!ax‘Vթ.#i֩Da\fsREVhե3 d6yW_ 0˓SO/zSB jhy^;yB4aa IQ͈Xjwy*Lw:72Q),x6DsALYQJGLo##f[bLc0!oWS(հ\ M>C87MUb_ q]SDRdicFkEɍ2ڰ\\2G9gK\ ,,#BEZr[(E\k]@0d)@c/¥TxSL5s#tK v/0 RoigCbiT* Sdٻ&7ncWXr'U%)S')";yk D{.W|C.9$gDqE9u7Ƈ\`iCCD(>OņA+WH~\~>R00Hi,V"->"7kB\7 >~aꃞQHc^Hݠb-r-ӽk<9$a|AiIr,v v8znym'ZCܾ!|dSBȘ]f>dS@ >Q`$~qeXZK!̯ ,W/^9A40.`TcWA1.?dh$fFȔ\4Eъ~w9FsM,t{gKh3qԚ!!gEix|4 p->e\ݍ([B$')[C/ʖ1_l%Ͻ+/xm!JU)5\6ɋ ]݂~'%ܥ$ d9X^F160˕ZSKs%tqӝh6 ֒Y?Y}qHn1.% i(b\JȬ̫p SD@vaZHkN`]@iSώsD 9B0l\^xP:4wD 0*l lȧo{ozJC#j*+Rhrx4`ʆ'=բ U4I}$}2Q6x(&̻%4ֻu!߸&Tbw#d[ bT'&mU1wKhwBCq)S҇nӎrUż ChP]m4z\~ޏbKm{Rars]s)a'IK](* Ԃo> ;Z^=öȌMoVw(/滐)gyv|mɆ#3Ox_+K_ Z,?2s-ln^nË;=>90e򪽐)(`5]1y6xWOV9h6% A k3Ũ˸N$YpX\Xuga˕{\ M DbӉǛ(tPʐ$ի%9PqݏTl/l[ڡC]};A62% 4KF́` Xn4* &"߅oj74;~],^L# A@qʱS0\D4JD){ ~.f"ӭ%c{(AAN@l 
Bɩ5Bg3Fk*E9RaN%%H%5wC.[ϔ/?ʄ1~e}`ϱY1sVg<#~ _srE9RB@Gy048eCK5 ʙW? O#];<"!jҽymXNs "\/)j%!_`[ie +]` )DZoiFӘ;̽# @cW* Nn*$0YL%1w&LӤɡ#"O~! L 1>:}Y=~pt1O~KU{[?| ]_?iA99/@I:)!iI1(e{<!{B)6P"M\s$1 a_9)Ӂ8 ˚ΩmҩNqV-s.?/&%n7Ra9EG`vm=9k[=3|%@w yڇ#Ϋ9{<$p%KS;;6 MhWvlAң dtOB&e}97׌6^YC aE4_/2'{\UpW^]~~ 﫜Pb#0vJSA2H-ZVbG)v 'f@I_a4 x^?^ `odٚg0I>$7Z9fP©j ?ɨӼt楫0/]y]„*YAd!/VgO*X(+\tp훧/J<x%q~m}^] gpe5?__kszXՅfſ3^P˿ŠZxχVP@9A?!/)#\߹c; _^Mf1JQv"1_,L1FD¯iJnaMf#w"H0hJ Kػn}jNt !W<;+םm{RzpK8p2.@3H5ͽݽQO irW b,.I1aH 1EBɀ/V#~W {scC2QwDyޭ2A8yaBtJ݉ K$VCG1=o4a~!Wp8r@RYYgJO$낟^g- #bT5;mJ۷q7H€A0x"ɼhB>M4Ͽ/@;it h̏0|}Xf-NNYX/_ &w7)mۗOY\ϳ^hn:^?<.wVd#cƕ\r?77m[<~߬>şg7un4ss?q̨/5eDŽ22ޏ4,xo4z^UwA~+P,o) ݚإjAV$_~/_ AtS~(!KɥG0<)F?$)qINbk" h+@3+46:M?ʌf|u PrDv-+uo7؟K-f &\_;<S%rc evk;D;@&}ss`yc0J15>93>i_B*/=w~5ХS$r9wfoInjv մU;bgZ6_AME`v7e+Yɇj0ؼH^"$-!QY" b4 ZG9%նA v_漖v&LK̴ykd,!!gtIs$O i#\ i:RS$X7Y(~6 Oh-ɫO'=ӢIpmTN¤S ҤkOsHuh1ڠnq (͛JfUHNMj5X5VDǚGmV\|Jt86eU?o c}4{&>Ljc.mUu~{oz-0*<쫩&Q 9Qjp&}Fy+v$.~MM, /u=a"⍪m=͓oOS78 `@{v2H<OU(ӧ 6wFequ? )TxBѐ'@l{ =gvKYS1931o~y}sMF̺ (׌B ]8;Ov>gP. QT:JɡGqݿK4*Z 0g wfh޺p3~>v `]Y3Ӎrst'ps/;,{7TzSqMmh*T:cDb ADJZ٘$$h" 9GYw/ͫ ^ O,9jCtb+W&о:ʍhQ5<Ԏkir1] Ŵ^;h a̴2)J& cnRv dMoP i4k58B@9#Kx7QBg@%a^~En?1(}lzQuU PP^UNR GQD=ux;͌f[Kb~/Y*WW~5yO ǜ/!7/(n Hzy7D;s0E_ 0r6r}N! 
d}YsJ p0DGDoGwRĻ(MT좘8ߓbB(bdEM $ͼ-ICۜgg 3:FI?I|{Sɮl-;\m Ov)g­.5Z~)]n4C\ځ>z{\}PhrP+wÑѯ" | rrUYQsL%0}+hB/ ^]c^ mhγ#:+`#]HGӔ'C1ˤJǎ Fe"H(ALHRsnKBR|yJ$u$sK,u0g8%,8 ಚ;ԣA?8;\'s!9gcFbpUu"1bvQ?RChҠZ=;ga2g%Bbũ'?Wf U ?*I~UP%Q/mOGf xv1\!he%iIPr/%|4n]e#$z+d;HY'Rhb%|'1RjD(zR"&"j!ge  |69HugGCf%Ƃ52V] u3zD΀b✈Zc+l; Q5JQ0@Q1rUg90 rO/XC#~FR֪\}})֧ A>R4~ y=.Dӏ%$(*Ȉ"AɰuihlnR俹~|6wپdT<*q3CņDoIYigqOEq.Gdҭ[)Pko=_FpS7^dA}mZAa7_gMW *4 3Ʋb46YQY%7z[{Y]_ݖ}+څf+-&^l)BU2csxc^vQ(-==K>t*TZ=^$zQ5*`^Y : ceS6a,C2Bı/)mlJWCnX,}UHw(qt(:& `80 SSRkŒbF pBŠ R.X&EK\MzK)B_YC'S.j- Za0~ %Mj%W T+y86{%ޝ$(!58WaȌ)x0LlπiN4 ''B.R2&^,[;OUR}lGKʺ/7}wEC*1"D?fYX`8"/0u)6FB$QJZ T̍H;a1]kdl #9 ֌+qj9 ,yQJÝl6[F"o<ė7[(v.quh}&nbIEGe ^C.zp}QE`7OFn98 g՛C榳lvNLZl..5ʂ~^Tz *K y*ZEgk[79X:UǺYʂjА'tJɳI֬@A [U vQmcNgj9>u,֭ y*ZI>uƨ>V]TjXGi99u,֭ y*ZE;5uYX촨QNn%xnUyݭ y*ZI8g!kMRU`թ>Ϸ$ Rnպա!O\EWtJSؐ[|[$Or;':=8g6F]O+/-|{)lؚ^Z7B o.YCS2upHg} ,mL>LtC5WѤl(}zqhz\ݰs"m YW rO~34.SWna;gI~87U9[6?1?}q+4!cz4bp'?{O-o^?cgϗ/`R߻fK]};̗{z']Kj߽|&l=z}7Hdlw?J/?÷rv>M&-M|: Zot`6-\Se=yg=RX"#RrZ&5fX47ZI! 
ʹd0"-~[wƾ^QUn {8`\<Пl2} r7DDD\@kvBK另<[{;Gٝ\[ȗ\w(wp0$W7׳|>c J]`jԎYuf"8(=Y;mZ҆-Z]Z`M¬ZĀϊb$S&j, Jbq҄# kӠ3M5g.?_!LK`Q!0"RB B ĐJS'{UW;;nnnW݅yDe!g.nGQ6Hnǣlp]buScxG0O_,prew9ʼK-;R7klD"_ػE \v`w/^ XJBDcU^{wbg+$L~ȞKc[El|(u.=_QDtǁ$h is\|\:w@?&X ȡ$J::Id;iDДJ`o_Z87P 񌄺Y7$,[Sa1gpx3X󉱀0e8}0>.Ok9Ojw2[7ىgG :R8fg- ߒb%%|6:Ti0j0V!n]b,Q}cfc$Gja!0w/I0pr ޓ6$+~p> ƾ def٢H5I1#X$dVeeQ{ИH6+"22#܀}Aö!vT_UraޛtTzV$TU^\cKy^Xx7&4gFڜd*B;0N2.|wcUt$zgV eU屪SXZ0paZm!MX!aDDP%pG!o= J[:Z DOҾ!m1,20l2JI0+ƥTNiˌH JD7nB+TJQ AEU)$a偦dK#UYqWJ4 6լ ڊFQҩUHUr.Cc Җ%OL64|CY1m@Q $- %D餑%* a'6% TQi(_>~gw=W~1tm4&tL7ayGvIp1a5ܜ}B4μ-0+5$B ޡsA7:vzWP8˔XWKNל0 eiU% eŒ`!C]#.C{҅?OFkd@MqAf`Aٍ),ĕ]֕\k~?'_]۴>34Ox`.* P㩊pw}_,@V7V%]tq}cWe\Ǽ}Q⟞tV1t:@spV.:<ӁRɛڪP[aٿ}Ów)\ԅRyt>~:Oָ[<R] %h+Ծhm/q7AbE9c~͖ h)*(eEHDTdqG'X̳ymX̃.bէOL:%eSzkpV5"oYТR pSfUXɋ/>>΀>/yPbz(ϳvLt_V}$ZrVX^fxqes~inXOqJ1yu<<8vYM6`!մ^.šMtVGds\B{zvP:ɵRlN )$~ج2Lf9n3eG=Mƙyou!5^cg lX`BF4bL/mu_8Fd 3DVmJHdZw3FIc.G EOTte۠U#=c`z&tx)Nt^69Z4޺MdA ĻA Ļf*)fk 5V הF ]0a) XI(nI}PRH{s\W;w.w ]hrG%6tF+*FJ%TiÒiAJeC;%qGSzdΕ ]p+tL`hVTX%WJTJ9UFy^R,Cp0Q+L4w>(;Rޛ;VW;.xwC ]CmrWN5b [˥Õ4pK]y)RGsGꃒ#G9< Sr]Cs.&UKR Xt.%o+Q2Rc[8"{^YRtH0}MwI u#m{HĘxuKpT ފ2 #N($fBdmHDV!,U  dʁf* Ԣpø7#\]H#bDD.^R6&*;щD"ŜeS)C~Tphdb54Ji'Se>~vb`fI6RuXDJ)8CXcHOJ=SM$z %mt$:L춝yٛnKjLj2%!lL虦!hZX.q>mb:_ړ4( 03v/%{f6y Ը::~6xt ]YSc[2Muu܄K D(thN}b3$X祖ta`icJVFȕ#j Rb+~b1&ѵ\oF,Jiߥ։Ďo'Pst=_"pshz>!Wv^0c<@zmiV%OcBfXT`/{1Oy!_PqBט?yws).oo&`>;-YJ Ljv&JwHKEu0Xp*dHTĄ3!/Q J6 Nx)3Eeܢ|?XGB'T"*XkN.+mkeIC!A=]G{w4zi0g?g߃\ dib<4H~m]_~/ZZa3 tǾ5491`.ysGp$5QgnrE6 =3TQM;Y ^C%@vNx wHxde1,ӰǒY?R *H1cK& JiAn bctckM#5vM u(.;@.B=+4d?~XVЌ6W&]栻e2v~HGpX)tc-st&1mV \-6\_3="Ub1кQ@wܥW񜙊UԃL UK̔!qg )/,=.@Z)Br >_8_7!+֋?&\.٥xlQYwM|$4H V5~㟓7V7=x -y @AK̆DP}_"g\RLh).>ѽH_2(hDfci)_~ bš_9q5^/FMGԍuW } ^(_n+=Eb;ygGaM:ʤMs QyqB2%#W?V5^"pNĪx9*SaKqs(2<2K!zC  UY$ܿu\]Ps'2 9 t~"f 6bƣC89rjnK[3W).E S*S[!HraYK?=NkN-2;ϱBYA/`½, =Z\wa.sc0[T,n)"4ec>xk3JO+?7 KTn<2/j\M5BJdt~`ԹUx&A9ˏTW!*̐ߙ.j/g-S1:@.", rR 9=鬉@5xh.X&0"K]"R㏫NJ 9% K>#atr@msԓQM܈ Q RTܳKR: d)@kAiyXC0%2wul!1r-iTYen:*}U(Wna: 
)Jv{{m|K@IcDhI$hAKLL۳mãH8wTD[!h@G1vF5 jmG2H.Eq@ }aO_bOc`ԻJxn40 qJ27Gp¤h^#q2'p:R"E:Gvbṝ13əA5ObypÄdBfuLr=1t~: %ie /j һfԀu%dpwTrJsїpj4#i١]]Nr32ǠO$M'j͚fNE4j>}|Z,׉2zkp#9~+iKTjEdoKXc6e،ag%|*@-&ss>BKNg\/f?R9JkaKU!V%Ϝfua#(|q&AwmO⥚B FZt!yPp4uйnӓZLo$՜'a7ZzVҹt8鹵vNkiIyxIJQ6_ (: 5 X%Hy̎soj%鏗&XhwɎ&fy>{G"M2A a-ƨnǣgSU 9iyud8IzBcHX`8r~6S$hq_K1PA5-C/y|m3Ir \/䘡1$ކ1xx9g'H@&^#57Nz:@ɾxF [͹B(H^kOJ#)=9sNJFfsrR>mB'hvgS:)C6BT2N]JIj>>nwP#(m8㠑n4Yw@GGRm-1@P ThZie@}s:RXO!MHD!թIs[M6T!U:L' :@Za WNIr,&Vj*\bJŒ ƴ#{ 5jM8rsN/]`v72+%}O&uҨ?gS}M:9C>u&x9f7G˛4=14 hȋQVm@} O>f_|t;'OM` E=xum#B)Bx_9ropSJdٟ%MACֿh0 ŕn>x;N D&_nͺͻu#DfbCo>rU6d@#:XRaԯ4RFO^k,;QR#Z\ˋe~^Nm)!p8;G$㇗D,&oׁoq?w,[}gۻ%Brރ ~&A8h_~;~|İe V(37o\0fbeQAbeu{XBǛ?.;A:ձmI[9?+Z:6`%zqȀvpPUۋ_?$Lbm0w6Qr)U"zkr?-%0ܩvLwL<Gh%>9uWawPAh[,Gh0uCGzpֱuBi|cpc7Qa?Gɩt?*"md/$_8b`G-ݭ8xB,nfA 㯃THZ9C e??Ě(NYzgӳ50Кk0nKe3RLDZ}3AN8;fDS؆6VNʱLʥ-$-M݋?#*i7v=m#& fsdXTdL I۸}B#f} yCz&? 9o!)|I?J պM\o1#ֺI[s;R"H#Z(_ >KNk›dyž}6ӑP]23 ` 1iV$5nqw@(s1LTkӝnM'va=0#=uAoy= &(ѵ2p7#uѴ }/x'F%gC 0TnHח8-i*wWE/^·+Z w_CtJcMD(,,I1 BK!# $MԠi='4cg )k8"re!fE,xw#D̯]GYhVsסS0~bEl`lwIS-1Ne16a<*i#sۖ2~8>v]ˆE 3' ZbPP6L&) Jbq҄G Zqe@mooR{.LTziW}uݭh.:kVJ16s8״`xm(9 ~Ev~ #[蝌9 y54UkfY)GCw`>R x`|?}['.|}U]pGi!._$tp僿8j"BL"&`W?~׊WvT`gl4107Z]o\I15;V¿DzPPpgh{҈g?*AKPt[?#̡e5l+hO,RЗ|Ү׵JJJjuTzGX4n6f>\1马Wx=!ޔey/6ƥQU|dD;ZfԶw< \wwj` 뎧B7SE%|K Y2dpE=o܄oR:W*^Wݜ!k]bnmsHݥߊ+W<]DZ2YZBpFDaIʔCl,B [ 4H!)1Ʉri*0&5sRFo%PQ?aһԱ֒k-7ﻑ.eReUܪ$?}XSU?½Ww[f> Xx+|<+srkf?LJkeLoflzv䞚t$7@?ph`֍sWX |QbQLj'q(GHhY Ǒ,3J<}v"?&0 ѿ+I )-ʶH2OyȤ 2Jc10FZD&85`*l-#1$p,4|H8!s!t$xJsy~/KMa$8tc!Ռ0J9n,(Q=IS6]ɹ-G̓ X%=MB5*ˮZz{:1#@:7bnwE؜j\y5>bVT9CwG\1t:Y;8 Plũr9+TXnȽ_I26ObW9cSdS,P +\+p{'sEβEVE߾ԚA[#P b>wÌ9_xKR$1`<&6p&&.vEe/>WRٍ :H $l#PG_`]sǐ󙳝s4] X-\+osqDn^:Kp$C)EJRhy XЯI %ĺR[.=/}ggAVg:mfL .WFV69թ4*">Oo7Ls .̯x~Bjh}@@.6tv<8,$Tsw p&/7_gos% [ڦ <㘐'L'\c=@g݇-Gb72ٶ/rMnM(D`)š)YPG61 2Rff嬡-~2sACT<[17~6Of GMg{nVF%xS?1u%qRU7M]]̡ 隙lȝ MUx6eH6ŢgChjfȳ%]?̭2S _Z}+`N'5aB==m&0Sd_IœӪ%Ix*DBE38:UZD\G!i5ƈ1C͐)QLDE6*C"2$%Hs؊0˩H"mtc!VD,zX)J}EN+ed^ބ ~wsA2/ֱrl~Ͳחbug}qg%}?pDDhX֡?#W`㇗ OgB |'@.!PzgͲekw ڽF 
6rcVoό3F0 {Co\qGs>]RҦM7#$FSC79\;0aFӏVXCړ!߉P|}YH2ҁ~zi$sG\/{]d5+f ौ5\j$VX#p R0XB ؝ޛ+^YX!d48"& qĭd1\"ڌxe 1R ۓK9P? L6 ˆ&7ƣ䶊ԀE=ًZR$>IC7&i#%D5:M@"XhLEbR˭VDqRnE82b QH4êQ!YۛmVIDNlCD(AqlC޵>q#/Ww~Jlek6J6 `lhQ'R[߯1$!EZaIYg#AwݍmAAS "p\z VB( %WX0.#5A:GjpD" T+NV& 8c&Ӟ ̗: Ԭ.GW736Bd@S9*ɔ%Š{tq{Q1­mHŌ|tC?U ي Gxb V>޳̊=<6Y#J̇gWGX Xovճ4"|6'Z’`6H<;be/%xSzo6eJ zgd6=#0wFNoDz89_BT\yl@@*{` NNOO=1?497=yF[Ǽ&SExg"few_F ,fJho}?WbNDMOy;l"݉ǨH#C.^d.UH瘍OɣI%-s a?1ƨ5[1yƨ88׬f>5F}4GcnZ1,cRǖKv(Qm\ۮOn=iw4Ӯ5SZwCEhmGyoъyxh턎cOBwWv8>)O%PT͕o 6 P`ܕ tQr jzDY-{'ZL<\6_O{R|SD5]#+'}iP_x/aSf9)~3zʮ'c)˕!tL4{|1GM4{_M'D)BX^m}q2@F;C_PVvtP0= e)]] rS:0ۢMJzizp%".pm^gkWq}3pM)d?.qz==F!pVii0$ B{RZpڬPi}#!?,>;sO9%#B fɕJkؾK= r[x2%kEe ׉~J똓ė!%՜SDak1\8L)As Ӻ$E!!XGJɕDsXZB{j%(YIjT_ KnXZ*m b\*A$F . T>)J|w#)'X&.Kg`I ,,8$ #v,A݃-U$PG âM9MQb8YRgR@4˰D@VqXTpr)jyFi)e l f!ffTi* ,DhQ+SJR+ooL pMJW%Kpe I;]%+|}XpdQ$@;RFk{˼BPlr3Kv >ANk oGVR;K%'ҤdfKƚGs\Yɨހou7MI>SچB- Vir>n&M`@]͞MAuPooS*~:ƚЫrW`lnr~w]0qI5߄ܮ[ӿ*n1y/o=kU v_ՠ\wʤ!O\EtE FuRhb\sI^߿ں%3krhW$⺝uKlOZN5!WJ5?}C#7UBk0߂0f۳) BOYKN$pӶNQ,Y VT8akשRaZ(өx΢ҩ@b4ҩnV{DݩpX~Z3S]8T8b[Tdא^Kv'%L, Y!eɘ1zF "0$ EcVb'NhDޛNwOLtL:CJ4KA)ƥ1{ U£R 𼅖Q2"]BFVc+ zu+aiGɹ !2|!RQ%HAVFMvۆcL%,AuݶU4I}xϺ+[ bT'u&m`Z?ҵuKf4ֺА'):%bк{?R2rc4݆[RLk :R2hW$-[ T qˈӈT'ʲth]VW24Ӝ 2fe0^ iC0F;]lh=pP+"j(Gۮe9=uӀA bmM`#A y*S޵Z7892Q1X¥ohuˡ!O\EStb>u؈KA$Ⱥ [ޒn4䉫hNіEw+Glږ RF0 swFYE(ס8+^q'uZK2¨%Ո eڂ > I\ \ܣZ8 Rec\RhԏDu EU4I('g(E *$Ѩpl2E-2hW$x}&k!% S ٥2}.u JB"!@'{u%3Hӿ.N?AV^M> o=30,xnodq]=oޯVׯ^(WJj7p33djeqP@X͕˰>tQN|"dV+/F!bo> nl+#SȾMkZI :żDK\ "l}~vbjd& ^۫5Ԟ%gip`̻l*/:CNgpKk!,q).P8X PzV:g-(MI [kTh9#ߑ|nMMReg<}6nôu몣*Gu1w`Gb%wg=HV5?T ⱸǂSk;ָ&Uepgn"yDBgwr y|5g/ab9[uZBlc?E2tP (]Qڼ侘 EVGoa 0r{ I*`ab_<xA亽PNTh͛Zo+$qqVv{^no_왵)~w od/ey.f[o~ \]} 5}HdKHb:t IZ+Hzf$$,i֋~͏]o4P(O'}3qZ t9Ѷz(f0@际#ԊTO2(g%LzHP̔W4S4|) ]}K2=:їt4W㥰2axe%tӕuPy49РKFf .Վ*n1Y+W,>mN G /&]|Lŕy ,H˹/T_' #~7sKO"46"\'bWESS[FHr 0?ԅq9'oo&Y99Y+YllnlN֨0SL02/4"!'v=;%ob^^e4ޢޠ}x @'x`J`Zӑ5_=@Z9q՜wJ2Kk9C3&];mW} Zq*빷"\y=&ܹ,nW?D6NBkkz~y9/7C 2 /7c'0ϒG0r01|TzG! 
Ay[ }7`M33H1oLwZ1ZD'[PN-T;o7E2PfC "J#G7V)gEe!C)f:ŦSOj*.KKo@INqo!tMbնlYT7#C5 (XNG-_ +ZS=۰p2S4X4.b:BO=Yw}W¡"ejg}UT"uRˎ%XH|y(CH 11PsMT@VhBk#8UtG6$Fm R-J+IuS.$ۺ( a{$B99`B\ (hE! )%G`:EIY7( V2Dv]Nr>N[zm JÐ1QHuOb;kD,0 D9P!U?t?3eߧH(8HK~u8 QI_Ai20`pFb2`}hx*6PC sR0k yNi FL*!GaPAHo)&9챚`ȴdϡ'0%5>3D`2ֹ\{{;Pek#e[=g`};95Ϝ-.wXc.V\wg-B.@!ʝh1i jjf ZSd܃p9BVz~Bi_朠}GPuCq:T0bՐƥx#~'x"D?U]",(/] u323 ,niKO`w1Kau!Eit ;I̩ϛ-Zc.|]$%KrzSIj>pOgkt%%ޯ{-`InIXQX_I+ODQ^Y9iΈ@'\?[ *>psc.ze\TO[$X=,Vyq}&mju:[ҁ Rco?Diz1<i3^FC{ eVY!r@siE4ׁpF~ qA.(g@K3ÜWZ |֥ufNYk>@(y4Zjv#EKOit^@P}J dH)04;UnAh,@T<#%1j9"-bG2DZlQvĨ5 ,1|T6*h#m1e?yK1D˃Q;yeEp"Ú+A(:1ϳ1vi#檧ccMIJn6F ̟%mF#3Jt_g6j ł`!,1f!s401>:@ rI r8x+9XFM`z.ID`8i01pII#EiL҄=aA zF5mHS^9AΝVpcr;lAA1⸏y12HSҜAQs+rpNhGTr Ic܆Q9*paw*`(,]"Ze<1_.%ܟ}(#w[n*|o{KP ߓ~ ) oˏkT]\DD?`EߞM^,Wkn|N'(÷G=tpj5CzujL&Z>0;_1yA,Y&];Khk^IhÅNkBWcĎ5wpŎS n%ȝlG4YX+H(a1W CAO EFH8"\{'c <(xt8 d"8# h(!cA,S7.PxN{VY "1qm|03 fR|t+6l ZaB&E:]5mFW 3'g 5Y_7̈́N ~,tTT1,&f`mE1GeI3Gnu$9?6a3#%cqOlqk q#R6#|btkݓ&nP2F &b8_ncl<竿Griw;:pS~|f Jk DڎQR'&>W:YAl]A B_^򐿼wnO?6jv ).*+@|Y Pb=钨-&!ƺRz:m\.YY7DpxX2)yDmE|ʬŪR^}d7wACG Xc2ןݐҽq"@p:}0σ6T!O2͌`DbcܕCR^2\X} J@@wɽ01NWƤ! ѝ N$;=HqL/ lΕX0 z2VHalJRkxFn|H"d,Xue_ll W,_ s% ʵ4Ȥ$GqGav6so AͭpnMaS2n vTn7͗6w- WJ᪆1XF614kN=QI)1p%rdB+U&u":6| ȥw Uhv}FY,BVxzv&Wuz4}*3WabOKDK λ;`H#cn7@0g 3Ä%, Oovƫ"&[ nbˢSW1yaeu _ʪ)j4}6kf n [Xܾ]vuskju9F/~Ve&f:ZٛEF6owMqvSByrd]){PL¥ktgwԴF)@J')_?fC` {WD8L<&@"wnv:)"2 c%JDz{ 8sYʭ*r.?K,ύRHvti [1^:~8-ձ̭Ts9'-f~Ә MlpJ]"M! 
JxAs"*4׏2`""DE,QC3Θ(3HY)#")oa l]+6t6tS%yo!/AN﹑r_fs>%Vݽ}SH 3 硌Ҹn7 9̖c- GAgH0丷pNpf%<!y6Qٟ=_Tz3n(30@>NcaՆg~<+Q{{ekظR,y_U m8Zk ȯܗ_7@mZD-AC@(Xr9rJGHLwN9HƼ샣HKz/kz`rvH_nTPUͭ l}D&TTw $*șleLeeVzRK@3gÕUe8CL!"厮kJ9yDF-w=U#yp%NO%B*e4~c/PeDYM[y`I:ۈkp_9yp41!,B rر u`0Y;h`;1oQZPr 7yDļY"͚Bv'ڮ&B C_YE~Et9ow&A(@U͌bsdy痚0ҔRTc.Qͽ jW~1 B'ژ쏶G[!(v"lONbw$B:-iR<.iS #8||!mlLKn _aLQWtpZK-~wc:׌i'1Rs܏VHђYÕd"ԎrKP2Ja| N"e`7XQL)2{WXgږrJZHXUYA a&M G`%ؒkAU$NOOr)bC9nnŧ@docw</ S)Y#UD>:>⏻#Z3@;mye$Zt9Z`r`ӑX~)Lc~(̢`^VZrSں5lҘFM HobKYl*zlgTM+EꉮhC `>)eb;Z5-)(7%ӾwpI/B UvQEVI"qzمCQ1/l !fLon9^i^)J4j#l7 ^ ra'SBٸ(r]ZRb/d^.YWC⵼?A~@Wr&9Lzpo독v󅫼/S7;mnÎvv? Q Y1VVD'$DY­/`UL b CB%aKJ5W>fbv;gjG?\^@[|nٛWXysDmr !p=[np<)=|VYXSJ*Qƅ6* gOci&1@Y:-%Gj;*}7H|;=XQWޞKɇed/dL!)"גzG+K"޸+(-EñJ;iKz${e` EMB[!*~;%{gNC~!n2@-Қ8Mr%%A~JZ04@KbH,pJxƫr"TѠ- XԘWBST2T`8UN2A!(_6D[R~Bp*7'>l㿹7iLЈRփ ӑ\{A飜VKKVdj%}RbN"$ŋ9i[KC Q?/}z]Ay=9 -& 3q_0R؏˞^z6(j#{L%WJ=Q)/CbcZ+)Fdۦ ̀ėip:ʙ/;G-GW!Bf 6cˡǾh? 7#Bz^֢yϏ_%Ho/ V yo$^" k$s,), txSTZxɢ})4k1L[Q])i,aR'chni)!ڻ5H*1SU]byEJ)ŵ[jGIn/B"B{W. Y/>)ĊNFу\)rB`Mb)*iq7ṈQ)lKb2Ei;K2rD#A !vL4-&[S殌GR\ ?~*Ya[F!rsgOx>IWY:^Tؤat0yMer.6Z7{6;7ZHkusdMK]pͶv"B?/Oe^1Fx"Sۑ :]-Cd8><^"=KGyD ij/Zhtm^j6zd(pmξ> h;^[_Jcrb]gGEHp 2` ov Su,{_3̜.\Gƃ>)BUޢX; t*ˋOo7 /"{0Z^[C%S4O/'4ET{P?E"{̽]+/ AhϫSz<@نUEr]%(kurLl>7Egf|6:dv$Yd_]/o_4'XFۗpPޣfR|9?NcH ̈́z6OޣC"d{m>~ɬ(L{oļ}90\u"G14q1 SfEʥ̖hF`?.;Pvffl2+9# ۹im{cL%HC)<ܺw"+HSVzZ4Xo;u_Jp/S8-{,ytA}bxpI県`*= Q+U(볯cƫQ5"F<>0УU ]QH7ՠn\ z_=VN"QGiyr  4=j|08;޺9%43ûmH1Qz3Jp+g13p`7k |Y?Ό¸!R~jȬLm7=eb;Nzч c\.bH۝knǐb2i8EE0%<=%F'OA8;%=:Z'#'|49ʱǦMxi;c+]{lC&5̍'AʎU~4"RR,I5n6 [? r8M9ym2MP#im䡫i)"l?VlFJ|'SvT`y&*ɀ!-7$0np*G'矴Lbv&ɏ^\#S=c3Vj*p|*5@/'us n1@f= "@K:\{j'oDrM5v#D*sQ-xs~+`*ӖBvĦ8aM@NIN4AzErHdv'k'vK'+- Cr Jyd++`94,٫7dT7<+ OP ӪgAdzϝ2i焃:+$}#5;uV0h,qV` SշO@ KpNAD'omsˠ@?lΎ +yV0:pl u~BQ>gU`tCy їS,EI7[0}E7\emkOc@!4hٴ?^uq޵q$2˱φb/68 Z)K} OF}2TOx 6,TW}unx9)fSʼn$~gƅ r1GK@hQN-)#PR\ډWflޅJ0Q$M.!pJP ~4o^bô   unfII( 2ԸHUD 4E:F"̰ '>zqNy3Y l6Q @zi`noJS. 
BdP= @?e̢ ~>>FƊ6&*e<)W!a ju|09HR| wF- h48S&* `jm7*B=9 <0H!Wʙ\)n vdǪ O7d2arγʭ5I'q^\Ϯ9x:Lyީ) b?*|Z|g.Y5 (ȎPP;nb?]Xmk_vp?/;Ub=MuC@[T /n( U89|Ur29y;Zvn6O@f1!H R0PU:xrLb"F8XINط 6?zZ.zI.Ǿ]61]6161@<8$l 'xUV[1|S"c Em5跪7ࢷD~ vq`bBЃDZbE7.mҴqv#=ظX"}qq}ŒЃ%=|`b7.Ll\\?bŲv`~őZ3a8EO{a+)%^h5bH"r;d͐pbu}Rp\պb-6.V-.VHl\0;ظXa}q"Pb.k;Hj1*TiB# }fq[]ȃv X}{@r  u%`,.eBDfBKe"˲ .k_ _[EXP7{*L]&o8E\(`l,l @!<3 !U 8Nc?-ng,)O:N70ނܞL"ݟFiz} {;|n}J.5QΗk!LiwcnuuA>cA>(S7fTZ6r\{-&˃&}:m2}D!Ƭ[~'ZCBYRj7rrX7˃&}:m2}Pts3^ӆ֭ 8D0Z7*w"vA>czLZB6fZ6rA;_{h$e`ry:XǺmn 2mlK-HCB9RH|/Ƒ`ry:XǺmd6kh@ȁC4S =nTD˃F)gUU^ m.iU8D0>nhry:XǺm_d673_кhM v-2Vnsn64wk!,LQkRMw}k`A_Cd]tݪ&wTk&~ٮ_UM|%!z+Ԅ#틠%ZdzZՄ Z⮿iU{'ViSl J7 ۻLI&sWcnUWcVZˮ՘ٿw5ܪ&PJƬ]1 }1c]1 Srj@5^/]]Mrs]1 }1z]j S^JBtWcjmj&l2]1 X@P1w5V5A1Z՘[ՄzžԘ1"՘B޷3Pͱj]UM _ ΘAԘhjDop|5f0ݿ3%\՘Ɯ&WcҮ՘[ؽnYcސ՘[՘1w5658f]1 1oƮ|5f[I("ћcb2O^8q>qdK@}a,~ @ LcI[oyd0;Jpp BI"Ō /feɩ) |"+.Rk DŽ@wr5DEam"?rXk- w,Xh "`.¢P0c^Fa!K B D "A*84ʂĔcARG,!BgƁC#3Y,$'"#HSPpB$D94аv~* e X@s ;I c JQ`QHi$lJGZ*(V(2YX kvhaA-!{PmM !58s>DqCEV YҨv芅&UWQH|6 &si1M/߁{g;ߦ(Tp!Hp)/Ozf6 `d>hrpHsB0(!u(2?d%ﭦ7'G04ppNNҖOo}(Ř/B黙jν^LE93}"B€^t.U.bfdߨLכos"*$83|9;W`^0/}gi'=#ڿuLPg"gre4_/FI>_q? eLV i1VI JoW'>"0e^єU{^=%BYt? 
U,DM]TCԯp\H$q@L+.}7Ỿ7j2en3@_Ը0ؽ+HJbOZ۵Tbɢ.ʓޏ,{e9= 0Gc\3*/Sjͨ(ݴ{EW,f!.J9طL0%-o jxgVẁ: VR((H}#xK%̓H-7$d~+b Ũ\&9A2M,j& ~zДeEåL46RHPeZbibQR3q2 Q-V]1?NU~W͛W+bUig33LH^x Y#͋dVynvR_)\]D# bP=]3!?n_o~ &7'JR0LoLߟI,fLQlj9% fAr!I\pcF +梡HtF" LRȔ|׌!APqkpT`4/'܌|-ΌdRbx Qb>q ko#ը ͲD8g`BBt)_ ?7UMOo| [c>;ZXdv&WnVw?8"0ɜ&#e$1K0L\:`Hs/`.[G<^v`ʰY$01a I XxJq{̮}~lBMzn!ޠ>?t~8k&-;DlN|(,v½+\&U:.\*APvy_DImʹ3;QIjT[p#wa%Y 25|;- FhYzmR*|Z`|b#X#v&嬨Os=֔)Cw[rߎߏ'Ƿg*}޵q$З{snB?-%RN8$fgzdL=G(yyI.[r㲖Ǘ9"y)$?GX8pQ%W#eHZep&%tyd=P\&ѷhqUEz2:sw^Ousnǣ0ї#W`yrqVHcydrLm-?!Um`kL&IcKzsP4-p=^o"Ǜu&mR}Pe4,jwKj,2ve )B I 3 ǸNG aeԮ0d8kȼZ1yGN.+=;NʧVAQSH%`&XH Bupe XE6yf F)d+[ OLY FX4_?Y 2!C<w0 OmF\6a.$K<{d~yx=Ygg->ݵs<$0ŢoP{1!'k>rcD#ܽ?<9^S`t| W'|(p2(Qg0'xƻ~0+xC+H?'sh/h8[pƄ %ɟ?5hݢ`ǓǞ)>c4vv&Kw3ϬK8GUܦpxZH(\f7:_#6\W 庻j'1ǹIAݧW[4)ھ˽{iute"6ܮ-~1f'ENKH|*旖WJ e>*ϩiEŞ]+>@KY k>,d1E?A {Ay+$)mn_%g g~:Ό}+kI_[&=uA$ҪkwDd]_U~sݐҋC3W?]ԛI$ް%i nh1QHÑB j6EK'R#8["RC{̭kNJ`'^ʩ`Ei~$5KR(gdBbJ0QAcz{85֌rB(Uy-b̤j{8=q" Z,=JIbm f[Ldc 0vBȜo;z Vf7RɞkWB*w19gT{bZNe2fGقt0"} 6D C"f}M^꒕[?vD*O@6?v%׺'d6 ]nSIf^c ٕLӰt~=%crSr=OemByr|.ƽې (OG$)R{<w?ɠS82`&z 3RRy݆.<#뼮#eQ|N__f,!\y R%!ow}z!Ђ k -amrVmMPofŧAfu8h3ip6bM/6Fh)_()Yu F`Xmfr*cM9 uL=<#`88fOD!g[O>ץVH+ ±U/ch?ӳѱ@ k,0֨@)p+<5 6D͑YBvWA}@, v>Ymm]z,tyMO2} FfX}+7l6F8HvЭ%5$`eC=l}hIꥒg/]>M.ejw?Ζxǣ2kt7oI7.6Θx%%oVqw`0Gvf†j-flsZm~@kdMV}AsRZv6a"tMO`^`\- BJjӘ+ %7,Vӌط'JepznI&;NM79sNA%y `s` %Et[c)##kAD:й&Cs"Ks8e"bLAiERX*-$:Ol4 .뼁=& hg)p}v>ӺfӤL"T}rL6S\F]%}3  `:Bmd%n0ZTX3&lZJaLQ@FT+O kqLb&HI 8$+0l GhP;#a!#<$6je's)k5 ך3&P6"l%Y9#gԆ7g3ுwzTYoUiB4AӔQISWK+g@9,1a@UXenPzNE=my{? 
i Cn *ѥ/l@S< %lf"1`7(8o# v^͚ Φc{iMJHvj!DRp%޶ Xsgw89f= pKX/4*Èccp(Rk֎1ZX)R.zq^.i0]%ZIծ>c¢Պ O7A)Œq]hAdc\nLpX TS~ ާ#8K{/Wa&Lг 1hAEj8B30cɠJaU+)$i" PNu2"5oT,%M ¥\k^C(=Z㈢K уCv,QpX@odN-&ќ`1em>FlXE,`!}#Q{뚆* ͩb{FaA+ŘP.8rq$YpxC5eQnL1l/8 /`hGTzK+`;J//˚U07/m#욑/gwI7, ۗTϼ|L5,~?e99(D%Kd"XfR 'wI],wr={RPv~kwcPxnϿM:ԡ]mθjw.R#_"TJzV`fZa4}=ߞ}gW9&r7PRpv1f4www\$'I-C|%i/žN-\_kbaޢIR lŀòASޞ>[Q0GhI@` ?"ޞՎ$apH,9;4m`{w蕛G4oFt8zSߧ>wH 3JMCULs`+_S01"]?(D2'cf8^k>6#C$w9kbT n6_G^?{F_{-U| wbc,6vl+əْVKbwKNNlU*YbMLpȌpSfOF$26L2=z Nwxigc%/4$؃lX1 G! c橩Rkito,v\*؎@-J>̺79 :́9J6IHn ndeDE!!~)E TC+/$& Ril%i»'JY p2AgGra 渴}Xo3)K &Vt׀t%fڂ3*KhCcs)`f$zHauCIbJc 짰>,՚Tיw>V  #}gGU]i]hi: _}u̗ h"BG ngL \p!( @2ࣧXs-AXSU&Yi3roMKRޱKVTntM M V(cuW?X>~~>xu^g;_>twOKܾ঑x^q6MEb+} `qT&|ig;d`׋&PY聓DV/?v:OubEO n\:n 3]8#;P{R9˾ʝ_a%)$x'=I.wCJ)R;e<|DW:0s $Iq&a %XX$!yj%kZj$7,p<>i _ʁSFpR XR/45S^1=B`IYM=Dѐ]7:ˣE€A6.Њ%L[oqw[úkCmEb a|\%w-C)3T EIeFݛ~/sTd_ .z;I%=]U/zFR2wz( $gDk-a.B@|PQ^p%xsuQϯcz+5WhFOv|ɕ=jގA فdA jǻ۶Aɟ"bO 神ǃ)ȁ@ۆ3 }m8dz(GxlgvV$u/>K fIXLf>_xƻ%nhD!H6N"=nv"WJJ-3ix绻&[G慠p_k47,Q.Lx[|\$DpXq`)no}qǣq  =2jc@`Xwکd£cAp߂"W\xIOt,QJ0?ǩYljnP1TU!sM"Φu$?M$?MOֹ`T ǠV Xb?ũN9H"Ar܁Z~* @؇@X;i$>ܝj }`tSp:f+.cq9QtGX$:yġ{lP ̔ *q9q{Kpp"8x ZWGҊ?Qks彫(q E$)2 {f, 砧z t28%{ hLǠC3tDvתZr3#tS=%y22I ).@+~-(ެbq*ml&|of76S)؇k׫]st䎱<2U|E0Q~6a>(8WO:~`&O{Z$eoW&޻eMiG8lO L=:ݨ#7SJ)%cvVtk(^BzxʽgypzO;jͰO/sޗS&hB"U !a<@F\zzp&G(Q:FHP;?ԢQ2a9wwi&(j->v 3O:(g]Q-"̖~dM/y.saNbx/9AO1UzUjBJP%%&)n5y┆/s•d;NiaXk'NJ3|,: (.iߖn6^?=Q_@> bΏC+ 0u7W^ ZԳMŸя? K|\CU} N9?%cIplT1ױq I0kИ"$_F5=݅WEv.h,ru"e 4^|5ɰRԅL^UQ.8Ly7e@2'-b's%Hs%V@ ǻY}IntsSO~Ī8wWžV[,wSƗfz9s0] SvJ0728q;s5ȑu &vʉT;z] -wam@-<݁Z \Pwt~eߞ"MS e)ϦRų:8=kMP8V1g Q8P.9zCNj@h0D*tV/lsu1 dF'dVWQO}:64Ӎ}πP'Vr:f.@Wu;} 6&vc8ƿ/{ȠR,pna}-8FpGOA"5sXa-Ei*0# 1slrBAI [n}<7%h9%KIn&Pq0M ؛O?Zf֯\]lez$c˾dP].fObs~Jx-Fd~~lFdE ]CS<Ӆ&6;IwDmBV.rHXqo\2DAQ& d'@<)eQj /aA#S-)X{RQ {,I{RMRoXgW/|" hZ+#P[N OD?eIS&u? 
+FqU^:#I)6$lYxM[D0} RTQ"@@Yr?^r$9^lT/6 M^Ңuv=*$*$*$*u/Oi$W(xAI8J[ХzcK!^^@Y^^G7Vy?%]yaaMm"rlnPb ƒ"JUwe7lE,hS џKJĥ/Ӫ@qp@$7^}ùypW**A@5.KJKp.WsSpOi؋؄b@J&tste"rTrj']&VzeSWj}TITITITI]57jh!]YrE)^k#*iEf JF1-!O@y5|2?ZU'wRN#ۤRsa,pbt,3°Q!L),%}V+(ܽ?7œ;t&s0dJ K?e,e=YUԲF"W΋W38A)+5 $P)- x+CR"\Ѳ` fxhdZ+S#vn I;H5@ ivdXLNctX BP *V|fa,pL w*@0TDj,(?Ӡi"!mG(K %TS:8r7*c|n&y /;?a9+b`l% *lzqԳwe͑F0݈5zصCim? CjxY#&dEU3=QbėDf"Y1 *$髕 ݹU{e~} E?]Mavw0vWz`u$V%Dy/HGWС%GI rH@kR I=qXD^C+y@*zC|n1O0bt \Dj`Tn6[:!(rKMBˁTAiBc^(:iHY+5kéiN&e!9yCyu4=%LKJ4"hO@aLfWV(xl2ZqH-o "qk#箍J@qY%' %"2-q/l ДVVùx!:NLb( L .U߃URu* ײT pj.pi-G2z%މhh#z/=bRu8psqz<:HIA)*M%y|]/}ulܯ'ճ©l2+0 l_"L]z[v.Efw$X,D Npe璱9Ɠ&9=fb1E!\bd%fpRp97vIi5o/~C>o9&G<[2`n/= Fjgpf*&0PW„NjrEc X_v8 [:~C cޔLB{0aGspŦL653e7Z`ݥYVeWue;Kǚ=jUCcM Tر EJ%C '~JUqL+ʠW I@Nf3wYLVge?5v%k׆ w?fW pxYg!N `d/hP2^uw#RyN{`&B.~:.`<ԟ%5k4H J)Ml$|"I;M ؗv^ !FavU!E2ȼ`b\MswG=࿹U/n(* 1 &EケdxIiq䩯sj8y(G,^g=/aeHjr~I)ڷ]yA_V j_ -AzE~y!vׄ{`k3}yJ5Ao:16]E:|˾y8= H 1dʳZ1RӋ%8a$r5qY_)ݿޮK oբj,河cWMζx,>sљ{aF#|ݓ`4fx.O⥆zd>̜f55ޛOv>/5=kg8z!Ϻ>[kskJ4B/7c7LDuo+83AO>V5Z_'Jm-^\Ee-Ԭ`0wy[=T{NRKw4dP =D/)5R f)~3]]hF Бyp\`0V)cnޗ$BUBߟ'%I$Iw`EQTɤ|.Eh+̙\X{߮b1#_!O912L;d1=wA+mjJS-w. alã{n AG Nr;GC: is̭tZB"J˕3l4LNު۪N48S'sO/mOrֈ[')__奟LEh"F/rh"F/ڡ~vu7pjy#dq' '"P>~-JۛE4ZjpV7gl@`eWͿ7z Bk tќn-(B9 l7҄ dq:ǂ Th$e 3GhR^S+bL6梐jr1rzmՒȞmIJPX#0R\Ds h3XxhyͅJ!K ԡ\E崵C759zE4(Rn. $ LSeS4:5`"I^QX`˟;Z->E`J6BK9؀{IQ*= 'i_T:ZҁnkU䩼fU-KAI*BTZ5RTry<"W.r_sU߹NO-V~?~P&[oe~n)73sfsf5,ߜt=#Mk{۫-~> `6<r}D~W> cR!9## $w/R޾溯40ƀ"dG\ ]ǭoE?/yѶ?ѣ[e&M^ n"4h*)( z!,K# 0>|E:d8)|@Q8ҎW,Lvϔ6(*9nk|Ke"D;HQ1)5x0!fb4LEi,:0Y (D'?p^(ht"эA|+11]uI_c:Ph>f\&7bl>X}?ZBRvM-|-7S4,0@‘NR!ųp OyLs(l1rVZ$]$4!$,ZLYM($q6ߜ+C7 /2 Ev5dZrR/(a)QOqIp -D`$ G-s7Poξ%!Yp>o3]9nUr&>!<2?1MwU&oDPj!{AuqylUMV:0OhfE  |,#:P>R2Q&03buoJH)#.LWTQ($1I5ǐJ,yUrЧ*+bJcK!E'ᰞ::K˵c#O؎Iu0Z) FᅔNQ>;h1HМ/$= cČe(~g!EPz=7‘\gbC[\o6Bi}U4! 
)ZqĐ,@h8 [3-2GqSR"ts҆Z8(|ʼWژ)jOA Na Ef//d Jհ %Snj-\b(1qjkT@Ω@Xd~S.Zz*mt wx{l#yxwr)#W-1%Fn;-Of_6AO:2yF[CЉ^hi3%hyTLC/Wfbm?gBYrDK賻S?{ lpW/Cs|qcSB[RczOF'@O'2HJz{ /cٺRŐ15@X5O<ڍG ڭ)'rNABRx/a.вdM[2ze"Eыϲ"`T,Z pPyrD:\e$k "b}D4AU׵]EƤvHU*TwOsk4A!9N h@ #@1R-=ĕ>gD)P 4.nɧҀ|!t̉έVbjʵP_3q@*- _R=hqI~ou03ΥFNQ@vӘB|3zI`ᣀ%vH^Mdc\Ec Fn]^Q:b$7߾mN$.\&^<&ydgmR|>OIziM%ArH[Heq3#X17y^v[:Xa`_cNֵ"zף|Jzb2{7[an:XIpx !0ף=TmۇdsRU+])2e2^tǎ6,~LgX_e?ܤ{N{ac}tLo6TZɠ?0Yi1ὺsw˛vU(}aֳz|/Q,Or%+k,~9jRKE@#8[ C8qx5@ptzO?"@?a8]>ܧOb\c\(@ES0$Xa$D"Gy[X|L}Kzʇ;#Ýyӈ7RoJ?PL.ACz_տ)A6jM7sX̃E*H_KcE)@XLb>"'")nT,f:a ݭ.~0,BzpPjIg=җQN/#OP1*1'8b5WK,)5*#J wƚb/(!F$e'NZv!{ A{8 ǂ'Bce$l/N8A{:7AH̝w!,T^\u;d/%]SZօ3ր܈SWHDB &x lH`FCā^`{. [zb%uĔ$f6a+Lmj"eegE4ONfKR%vt2KqoH-eB9" 0^30ck<3]9@s~2 $+uhʎ,*WٺVz,6au-3gVYKXt}˧㲨Ҵuó푏 *m G+ngSLt >{9'nNۿLQ_nDU&=˓\E5jpપYVqu 8@JGL8K, ]e + RZY2kϒtD_LeRbbඋ'c"Hj&T-%pNJtCp2F(`-0ۺ$B*f$>GVs):n_w[et ~q~f7;-iz48c8\GWљeW5QA-?s|௅}VV}H4ǁdBs #qpD_±85]0xA*gĆOd%R>J!^]c vrp̓J?'1joEAq|f-3޷Np8{vF8 UqrYF!oH~uNҞu>nB}Obz%UҕPԇt` pSD*Uͫ(WUѬ Bb6Hsϭ1%NHI) 'Tj%7D%r 0..?nJQv )Fg+`30T҃VڀlT bM,;y5ncG\rL \|PK$ dS94^% wCFHF:%v>ƨ2R`ȗ!D Y W86%ؗkIJc|Fk)#jPmT5#b0"pDs/eit8~JY"ِJcR gp ϩ% WR(M0k4HwLZ$"MQSj:֎ j[,H! )UɷaKk!,vSزIUK)5QĴ6L {9vlkEoY\} TYKBJmY$ٲbq6K"8.TKU2-QIc#|% #ホc&Nߊ'Xv;$"T^4<\u;stlqo-OÞy^?Q=ݗ|@cyRdQ`'՟W|Ij}Z^=uon/h{}Mrt|vjA.;pB !{@[JW z]o5QOl22T3S-lȍlu<0%Quz6`4uu획r- 2@U>ހ\-.p.qol1*qyETFUPƣyBb85r;9$~Fv K.bfpW6Bܻ猟y犝5%qr⻯13ϢD'+ڛu*, |G eN 7k>X{vGܶ!tJIҠI .\8j,K^X)>\/@POwG837榄jnj\옋 sN0`&K0nI8H)c3QsP]giΉBTJu7)&3%R*oPKB=^hi(WpNRF %;rE s;5(]i. q1Cs m^Y ^f4;?,hL]taQާz̈́37#v:*&./s|~X4:ᓥSP}=@}y5h(dHtfSQ%{V imw]!983//ug~./ ֓gsO ӶU"a%.n)F;9}wt\FKu8*ߖ:z6midbC|pZOpչY8$vuJݻA²SkۍijP5kfb[Bw v.r}qgP<ex%aԒ͡f? 2Y2j9Dhɝ3[ϣYB~0, zp]a7yqYO/oGred\ҾuZ{ yNxPX`D8켓/aif4*.AQ!%pceHާjޭ %HR 8؂bX$AWrfluC _=HGSvz! 
=H"E4a}]8s9Z=HVBwޱ,Q$.ۺĊȩ;˞˞0:=ZD;2Ts\0HHҔO~O ;aiAQ$x#py@6[b(uaUd"0UMjQÒJ~թXM%O(*ɧ֒u+))'ГvWur=QJt8ز}7c`ɸT,ֿ8.[<(a'iyDtDFa\` F*Ph@8S~1䶤,w&N&.UvQ31N)̛_c ^9m'HY cjb$nExEɽ4؀z9Dz#bCHQoC@9D؁Vb">g!*pH9C}Nz tH`]cOh*TPq*GCH,H ,յ^0d0Q/[ɇ "Ҵ`Ȓ46UoV V-"I@I'FR!%juAHb_!Ӄ!kXv;9A"1߶4QhlrKj2(I7:26 (ӊm&YnX"P]/v̝DT`RVt9=7'yloN\ӫ,4ӸNz85W F8fƽ*Kz)"Ž+׼DTTx9&> ?@ J e6'cR* rRj0Q" 郀3dmaޙC<}I)})ŭw5zjFZMZE= .)ы ;բ)ukq\ gJx2`Q,;PPْr!k3 (E[Kg-ndlj^no;(Bq_ N>/{=+n%Oy◁fq4:t3wjS 5?jLT1dAsJpŁ5n: udG͜S) ,e!(hQYOlFpŽ:()P'd/1,b)x @ *q8hѻWkɏ&N) ܍gÿRBqg9ª7qZf!V؎5 $:}9LlޤILɑ=T%7 YY_< @7vaENJZw szgNZ)~(6~ًid<#aaJcMSG"Fpbq_|AQޛk[jU a;Geu7H@rxm>b# |6e37ɧUqoeP?v[AGp[lܟyТD2/D63|ʺ DRٟh^stJ`mY?dfvk$qlk*qFC$%Rk nNd$UA~_c]"Cx@F;αIM.pDGp(#bICf1 ~TmE3*ҡQAU7Dh6*;f{ɨd@=f^y 8h-V uW0cx(L01 i #~/C1Ե 2VJd]d>dAndGs*OGcw{`*c.LFw^gبm=ʟ%R[G[ Z>?'}xז7h4[w3q]7PﲆHRe,&8I:v$%K\`9: O0BiIq@U *u墎M@]ΗDkٗ:@CūX)R:"^~ \ƕ$ 9zG1<xKy\X_Lf%ƄxlEFS=Vbe]ff%ݜI9^b[aScy Bdњ\!cDbڐW.D9u5LhƆHmhJ䥉i@Ueń:8^agj2f1`@<)J p9Hc`9Xa%%1\Db&4!%ńEe8f:B~:Y-5FTqqF|Xa6YYgIlRXQRXJKP$OV Q,Nl4 jL$QOM96Ԅa"/p-/=y>Lk+sF1bgkAW8m[=^&D'i bcRN5@@_@sR>3ɜ<`SwdYYLҭKCKo Wo}aDj[MY-ʧ ;L`&{k%Ztj=P> g) *D P#ϓ1_Dy,z`]l)EVa KTndր3a?RP](b&| FoK(IΤxRz5&O~ Tp!lC`8Z~]ZHa2Be-NaOq>H=BrHҌ8f9N:2 QC"wKEWћ<&QB4ѼlhvMo4s%,!6G8wh=0ͫ L` Gm(s|zn\C%ֽefo8No;)cbX*-{tGl<{[s[- |t#{7V[<pgP> "|{`+|mհX7֡GSQn(khP@ s%J˦i5pCUu7}f q p˕kzy)KiauBv< A H=;}KzUR)6׷՗n'`UTWD ©﫪̅ejv K&1p# on9-1צHA*tƧbFqrA}9b/g%a Ū4@Sh `|w  .'w*K0ᱳ*c72C#ԋ0JlXei&,hӝU 1ܸsO&KOeP(eV-HQxA^ Fy*=6jZ']еq_/nǺKִef Ogi#.% |/aTڸ\9*ZrbXDI%.e)|CsP »w -kth:$qcHрvc&'4Egrd_ƒ9:vU?րsA8,`ma5lIujՁC+d/ؚkVPjv"ZLco{fGfeE;:TA@`&XjINSR3m8V%ZTʊ@AkP%26uTkÏ8Htµ0!ȼL80-Hp8EX3jvQάQN`dJs-s-`2uu|}!h >MjrM5_6 h0TeI/TM3\|$+@nT NRl(hڔE%/TJ3/7?JI73^΢BM7YMʞo3JsLUHYzmgQH5v2R84p6fNxJu*ŜܱߍTSn$'=I*Vnqm, afؑW=wT&w8{$g JxQJ~x5xU ;oz,S=~l$ "V_8B"fLTIUnpa*X%O-cW!MB]hZKڝ ԡh%Y6&p\9 0;{?>^/nfE{`/߼{ O?<Nwx3v?}wgg}v߽чݟ~}g;v> ѫǟ<2n?gQ_F]3\Yuu%@{"uVZ^;%݃7vz7}-dR+lm:dgq@8apa 0Iic>Lŕi o(]^?_>7/=!_ۓљ)nK&'Lɏ_VsP *YZ<5{h0L S/C }GιyԷ%v=V?Wſd._\yxý_ܤKG1e˂WU緝_I{Ŵs[~-ܯXW{ԤWYw&u>ؐD`Msi5Y 
'nx3l_CoeXw5)ul|F֯Qm骑MNWuu o;8jnwo6HލgOݝIo}իd. tǝ=?~?DFs0]6ݷٗ)?z7egx+OSQ\\;Iwʵ5Ż~ Ά@F`Ÿ<zC/Vprݷ L\{ C! ?MmL {5Ep&_;/bqz}OF]+4\PR 㟜W2+Q~֬sx= D# )JRV+'eܾvkoLKjav,a R]: `>SD@a2DN1ض {vZpW30v|Vz|V>-\fMH|v??{Ƒ EQ%@& ٜ@-[sAԐG W]UjV~̓iflքQa a]c#V b#L,\3?PjUP # ,4czÜR%AߴB1t)@rV{SPR[hșҶuLw^1}7rU)>E;#)h)C$cz(9 94'1HtCm*֚CtE&9FdmP9bYL oAp;Y3OEp:!%<*Y%-2yPf) XHŘs>O)\13|ƜϘs>lޚYy!^JW8jyi>&9eƧ /P(kw\Ɯ tCTY,?Fqxg|N Rx#Zz1Gt #Gt=mI["\rUU/Wàkk~VDmxy=h|wϾ!Ϗ4ڃ|`T{_O MJa}qL(M˱ }zŸIne\]l1']"d!}cm } 28{s6u7 J^[c+ffQUp< E4o#'lb83#]džQCD\oc^IJIJ&m,GJ,i 4ۣ;jh̊dE/^QDcv"{W-߭󁌭愣̨ȨX)e, P0% !1;&}Rړ7/ƤcwLn/Y8}kWdV^qD50 ;ƑFM|o \ 8m`rNF g1 $%9&@@H23EdJzO sG88ppۃn%f "kVd͆q콀4'2P0DyÎkWݘ+Stcݘ$/H~v¦,,6R/YiZoxT$j%(=ȁ&y6cɻzT=0;SYB$8QC;D̡Cafe1>!2fR xd6+%‚֕}UYwt&th錳(=prAL.ʬΓ%˚$ȗ !\" , )u#S4B!ѝBϲ^%oז5KT-/!j%cIS:r4>[(lS)$e/L$ I-*ѓ!՚mJIXJyYJN|I,!*sRGCϟOMeY9\4;Tܝ+e$&S 8WoA)l{6bojN-Cb"@:n1+kh9WLSHD!?)@* ^{R < i#) 8钡Hu k2saO ;7ln6uղ@\2P,ڪ$N, RL (^IPKq©pω@r@Kd"3ļ^̴ڍS7#ReBq j,( )PJpZ|MA yzFh<OT_s~<O~< |ã[0U99LUa9̽߈3ܖ"BQP!ƨJgى2VRwlJJeCqAE$B̬0B,khz+'P=!H_ˀ h)D>䎪 z}jEU#P0De!LBD )V$ xnr>]q39Yt O+ՂFJ6e,Z2 s=jѓeqcKq7Č\q> /fcP;cP;cPvb#ZJ˚wF*![UBlU3d{C tQ݀@\͊Ӄ>R"`6Ä"8qY0B >%(>G(P c%܈ ȘMF;{y+\܏܏y?a_`^$I 5AHwQmrpDb-soU8$r3%oQ${ ˼J"v9/ 9^DL̓y,VϰV>BqkpzO?װ}~:?/ZDT/\9JEduTyz?~%\SޫQ=nNhՋ34hѶыٿ~{ㅿ1(@Hkc 5#<;?jMՠ=+iÙZw(y|o {S Yd 9xl>1*٫W'Ҩ9ҤYVRpRJkpZہr Ydr_{l9WClc֭mj1&My CmoNO S *ٵKz:4Y~;4'*#t⒅Jzԋ*Iƅd:rH>I>Qe)[}~o.fԇt^ڞM٤XcœhěN/c"ͮޛxwɿ<=}]xBO[N''I/Hz!!'/w)WZ!i?,^l\Җms*\7hq+o7iyNgrVhC` 78g@/j_vjp\N>~hgLӽyܻ?ihi97p:Պ+@ Jhq9ʲ*SfY*AID;{W;c$4X2f8a,~m0Jblv6( fD[҈6(m p}ӈJ-E݈ˆUEmEQ]Khޮ}]H){ТFQ`}\٤S%(R!B~0 Ƙc0̎eжY > Bmk? 

pGR? zMXW"ȶeS{AZqmZ^n u6E*s$0"JP!a ׉JYp-e}>%vuBeCY9o6pt?0gDDmq`~H8͘=82Y0 R%,jLk~)́rPcV:\Nsn9Ŷwr۱2W 8lӟ.mףi`M9o0&;#J4(TM=zkٰf:)OZG(e35% <1i :cD(٩Iu[xp*T/v3ARvw%g " j>_Rp ]'KQT;beLC=:jyV$;./p] g.ƨrimM_nJ%[][މ?OsfhrM, g&P܄+yr)X&%} =R Y `See=kl;BO5g:$L;pI(|DPEjmȵ3Mpgp8(Q;k^,5-JAH[lA9Ab8CJ3NQhi2f@$4*V-"h@jO7s@Ge3E8%ۮn\2NЏީm8a-G,m1W~\ ZԍI܅֢Pk D I;-/(閺Buy}~,BE%vU2p.6%gAswo}ύ;3w9} Y}OOȣӈP.F_OA bgik/%Mж=0 pG⊵>&,ǻߟKI,bz@a|ohPLTcrT#8< {|`HXauJ:XAFO$jqAJˀH'`L a<1 !A2@d?NEgZS"4$4q)UXx/ EEWɻ;ALzu\5Njt e>t(s[W&I{jt7r3-~gb\M鼸і.o O6E魗6oFBecm2`("E!])D$pBz-M' \J {+f(=eAgvzWg`nJ2+KZjMCA"~+Z{[҈FfPX{.5asخ>ypiO`eՖd 2F'S 9Zd\[y Bd u`+H 0ݒ)𶽔 .5Xsָ#yk֪;Fy";憙 d,c~{NZm˄3DB}ÿf?i7߮5e#_}Zy\F_n0XWfە-Bȫ+EE17(>ֿlrG4J{3g˗*o' y*ZSa`6㡗ߏ0Cw9tz7'eۢ$(y6ch?".;kNЩ}K_Nv_ו˧70_~=_~!ǔI6knh m„?O19Ayn0Tӵ8D8xā3i[ġfq0-sm{ı<4"\ۂ\!]#gx^`,(u0D{+Kk, {Cմ=oux۟5gXI3{vѫ[$4DO2hJ\‚~B?P~2&ZВwje9h8^0w'S$cxaz"S ަu{gwߧp^De8OFO.0i{)JEa2=cކG#u:k7@ZMak.r޸c2p3m ASL2cuo$` ӝ*3}]qj.iqy)`W7_y{_|;_IK~eo3OtTQ:!ʚ"P[wIbkhD*_t]Ic)C̛ KPV4'T)3lwd,n]Q_x:N P҃ `#))]wۧRJH<].dJ㉮w2.&ҘL\cD G S:̐ މFG1MUx1х 0'j¦y5!"-!"-xͦNj.Rˉghi|9,I=AUAZP-rcdZZnTtv5IʝGkwRtod3!Wj,9sftB (9)z';-z FBA'N%Q3L0sVllְ["ʠ蒒!ESEYh;'H\FE8 \U(U&+.'$1tO: 7̢Mv)~!єxPR&6W8[`d᳦D6vϦg~fvL1kiyj1(!} /,F)мD1]!/QUJ|'aL@MWs'PO<5@kYK3G< <3VF˦/1G42jRUj e)1׵ct(iG.Ͻ =>4䍫hNizjg\JR1@I8[9D%hhVLZ] 7qџ"҂)c@Pm[LBJ9k1GJ).e2[M/&xi;GW{! 
!C4)I6Z2`Ac grritpA!8qUD,bBx-z&@$yH(D.lՊ.VwThs.o[ンL\ 4Oԩ` i}w{ҽ}N`|_c|]#E>]4pL)KFHu^PIQ1$kI31yS!hI v*&\f,"JȜ`7nbNcsA(SҤ=f:#pa9e !|JQڋK&-}pi9J_,9ڜ>64}<(V* $j.,(T) Q*\$.h`> A$Ad ΝeX@/;CYXj>7Nv3bt)}ޚP'IGّHaTA"è:&=0Ŵ>M7 Nz4FkV,nsicp˞LIDυgfn s:0~oXՒWًj'PX(.ӗxRpm$c4Q3R*YKB3^M[2w(VS\{CA8zj[uE0tLʛq8lY_&I 6zk o%5Э䓾----ּ6i;$bC TZ"a6RAs-T"H*)U|+Y;Pխd+033-dɞSr9YHvjQh儌eSֹpAB(|#Re llW+6K-A#pH/*qBh{bxgY3 C2jBÈ-'"CU{ 7/Bi%l5R3n݋LǖCwE%=fk_ J~oVCw䍗ᛗ=] r#h|EoMS͙P_'J{ !}R2=sTTrcqMYJcqMåiU*I(pغ.{yix"$g\MIB';P58.'!ƒZŇ][oƱ+$9 $6qCheWn֌jH⌆&p$VCꮋ; O>ճhd:v][`ٳ}$uEA%f0a( )9Mˀ:`c+f HjeP #Ĥc7e[ap<Yٳ˘m#ɵɘ&sC/CQ`,Re % +uPY hV}=9f3X :9vi݆Z DoAR j @&@s%qW+S#$0 n UP̃8ibC4g7"JP8{^q> sޤ–>0iKNq(ep@H&6tJHmԖرu]W޿{,(X'sA 'Rx%RbĽ nfY\1C-<C5PЀ$m+x2HFC6Pc.a51ö)⺤X E=/y4([L NGy#<"RQ0@KGv?Ủ20HPIHƔ#R&pNL9l+w&5$z"-(\`ޓ!SyBOd]/\\_v}1e: -..TQ#9"{JQ` PW{uyfL40$N`o5הSnUL\"1X^ ba%^_cxsD1VDbp/}u&R R&Zq/{l*v^ۀF z3g|eeyI:ɖ߳=p0+(("di"jlC٠LNy01zأ1֗|ztn'@6ULJ>HNґ"{Zm"{#{[p!GqOJ?& oB,$y&ՒGaӬH=pU(*йrg1\P8#Nno첂D<s~v2* .%ec~|XZ~;c\rwՏodX1ݲcz)9:8U[.|OE4Kj}jmj7T[.);F?6֡zhW.92Uov wzt>0a;j^k' +!ɈQ"U >kgۂ!RLm!%x.Fˆθ{Z۠J(08\j[o%[W. ܼK +6_~˵۴sod` =dyXs ɯg h(tAUq:Z4L~5 1 9Lc]}·r _˴e>bLLw|zu͝ڿNz85&fI11ĪBCs4K4ry1s{g*kvb\|tv>QR2!>[lo}J:F%¯5'MP &&zw/`q S3A& _mW{ބ +D~ν v pz`fG(C |1(C<ޯk͎xq#>#6 aH@-qiD»`.m`3 B ҢN9yk2ђ[U%ѝjyC͟ N[eJϩ'yCy!Va-t̕R+$A+bZ~Rki,uNN 3,Fk bBNT.)rhjZxx-^#}H+,fLU'?nX,=j\ RD'w6m,Ѵ[W.Y2%ԡyo\m{X "g}^WO3g1%`o5הSna6VD(c 0JC$ \˰S8n*حa7NdU [^K%OhKAV@֩tx[v{M/3aWj*TLjR@qk6;VUbW$>S l=7bcF^Kɼ^0{rԲco,6J`ą %st`YRR/ Yesd+?k(p%ք\PI.) Q$JK\s`,O<9,5NAG*4 p/έgQPO }YL49&g1䬞hzEIќzPVzUZkp\ vDF@ lR^G{22g wӢ+Y9kv,wA.ZSʫs0 LĊ&N*&. 
+,-ILhP!T&Du3F^] Vwm >Eh2E^A&tGo9DURZGKo;.eо\ԝO*2W/ی&Ւ8)f\c.(MSXTFR^B-Iy&c^.U b( ÑVKlIVyGV|#UOZa> N[rT F KSI%%X12 L&VV<X1zz=ĴZw7gz~=[AD_vU^X==swK,zQ5ROϏj̫W|{뫏~~axQ Zԋ` ,+8tr3C};fnU"wW n D#^rvy'L>Lc[S_㖗 HtWRc8;MΎ -h EK,ZguK~$!{/`#Jz-vDs-i9;\>3ŸSwbD蹐MOIʛNUH&])-rj+g]JFj`UM\&̴xۀukZjFi|H jPҏ.Rqrۗ ٢IzZ"iEi,XZ cŵЫ&9HF'{ƒ)#1:Zi M?Y>c*3|r /^+!mo}W-j0L $Z2!:{ K Mc1%BS>y%zD[$qD SX-:K묾y2$XRa=6`K ļiWs'P(D@ LrG@Yyw]F=6)E ,H+vZ8`^/ <^)4 b!upJXbL*x mƸ$ƵZD GG)4aoF6on,j8>}D Q쾟 J~Y[bkGJRVQN>{suXqcq+ ok $8BJLY3eC[4uP4#}4K!=ܶC+2%e{"S%z"ƒ|]>{JN$Y"d+FWxbx 6cogN уy. ^"]{"Œ Hܒ7o..(9gT㿄J6ŬG]4Lc޿}v{?[Ov&$‡ %5Z>ftI5vav` 닊/x,[4MyqWNW.O s8ǣ7ICn ZAQQkik@ 5.& ;Pe*!,gj)xy7rƨ2[˶B;r1i~9Hk#{Cz;1bjQïqG-llG"!!p1"mi4;y[:Ԋ2:ކJFeg]n!cP'U؇[OU۰3EٸXxPۃ!mHyN'f!Y-QЦZQ}C ;^gPI>);oE:xVOr W{d~?Ճ{țvYQ 4'Px23tbmBwas)KQ*%yOA7UxxC;.Lp>M*B=:wfyYxp-i 0%s9YR\V B!KCm)^]&5k 5@"V~kL!=s 5F$x6;6x}ɡLJ9hd~v49]YnCp69`/H,@Zm؃ _ tTw3ߵ-I(L5~$$Fd$ak;ҒKҒ ciI,w\NU>R\a(g׊e ]GT!;oK'v9gܦsX c8nUWpF39rT+w`pFuw/tTF{."խ Ѻ tm_0}r=mz!#6WƀR5p}B8SN'0,)D(FtDwΆ&8jR"\WJSX\rvU k5Ģsr$.sT(N;L )!KYC=(/,PLNm 6Ԣa!6 W]L֬9RR+<+YQFjlA q! J]”s-aDy"XX7F?Wjs9 z$>$w}`#$=FC|y" 8@%W=#& 5C3Y$94jLnI̡ Z޽z^Ӄ3z@%C5^:uWo8F߫"ƒzC2m8cnq9rnsq!Cshdw\7Kocr:!+ t~u+8B |M IXE2b&90 id8U'Vq(*BY$⦴ @.܁0|@lbDK2A5-̚s/x Vm:Q:H.S6,nH+P-Eک ,>Lju)Q3ATE)YPD̀rzMk^!OkH}4];%3 zIgd/?p:Pe/<TK1*: RK#Js.aGvT:y38 YwZ~9hk-47i`~,zƖ! S+:@pG[(^-lԼX#+J:lq(}~zTbWT'S4%sŽ hm;]:Wx&'-AZuH.Ée/F#Q^iA=Lb*oKfl#w X 3)G.(`CrmxT" bc]p K26jnp*:h$q0d`mT %h@w^c%0GNk?Pʢ Y,\%*+p[s .9)"gmNĻL&^ +߻ۅ\]o+RehvBBpf=o lxb 8]y@sB onȿWvfR}įb.f)}TpS *S,"'8-),(yg4ɣqZ^󫋛pc)߿ U]K+#`&U1VJ,m{7 Ҹ^h=Z5{\a2 3^pJ)1;0º|{`#fWцiR&M9EXԀM fT内29hӨ4qdz"RT7Bݗ <ÿb~`="1<&)B?6}98D_n}uȾ+L/|/ّW1fbp—*&Ǜ_J1 9k!~=܁ˍzy t \_GBb:{s98cݯvqbAa{F^>,Jwve6(ސI%UqZ]vZ:wY/9TR>@w5\呃0Rs$Ht+z ԃp+b3hhn9%QC 㻻O^s(pFjs7w?cH.5 &6_ja (!LN n_R3+? 
Ur2b[ȓlkƌJ<|ziyFg]6yv< /~2“ag-^]XŚCP:ep.y;xfp*k2(lD$ {9*q}~6(V˛QֶNb7Udoovy۩/}b;w㲖O6˧[C-FfF] W3L{ewW )I~O2i5–sQPE1jȕQm=&#Dpx- `vM\j w& d iu>c:q>y0zxM6}.n M[{yl9.n蘅Ԓ<-lZ[d_dx|iuWU,ɪl!b8#L۲= A¨}vE`}2CM^d4؊+YdHQ XRabNkXgF4/!W{DY T!&IB0zji$(.†c`s0*ΥDJIɤ;tD#`8&,sX[[T4Q0BH,a>uBJUja|e`DJ(أ:QO>4L䑘#"29R簘86bT ֣q>?S@7A _3MEt/(/n2q=ަ@A9aWaK A"%;>^zfC_Uaa₸ ^Xֳ 7{?` 3_W<sdnx`(sLQj +\Bd)Kb+E>h% @ӛz嵰zwX.eNRuFe# TH4L%/TC9[`#u%2b8zphno_V2@+ ྣVΠ];6eF!\[{+0," =3䘍0Ҫ/LF݅v1& &o9bLEHQ}رa1}*x_y/=qD}Xm)ȴ޵yTD?FD]b*nQg궚 W]eRA!8 ڹз8BIY(`W-T(dh\*(TIAL^OT KAL"):q볭BccFCH eeI IE@L1E7[> Z9fJ!*nl]I]P̄f^KBturզɫU>f&HIw9ׂKM+8+ @,9S`+B(wƥI,|Q])nJL` Wh;vc.E8h[S1L}FvaNꍶ5Eci[sX7$Rlލ4zT bL'mۀ {n/rX7$ zDtЁr@A>w tғA"[ y&dS~=[*16m=՛wK~XB޸mS)p7 R308iL#7UVS)@TD1Gp5 Du#y`Og=J/0Es؋Bo3GTF7Z(GaCz5Z9Ecn x NjљRS Л?zccY<~>gs!:JV UWC3}maHX1VY`4VRíw7#m%=u&{PrlMNjs ؂R9xˁ<> 6-1}ۆ4x0KX:Ai ~#9V-D9;uk/;ᓜv4Mnw_N|ۍEwfWzQNyyT8hĨblUN iSZ^>Gm<Vs#}~^A{e8hCQ&u<h ̦2^ThԖEuHI*J,M}PhÏ`*K(N셛/f!LU=D YZzX,8K^ Ka QR$..DK{>Yl>pߝM?yOx!Y3% `g|T;gS_LWl@<`%V>NDS))RDZYQ|R?-&KAmAK (5S%C988;S $PfXk AKE'bu!Bx9%%NJZ)Qp.1c> N9HCU+o•,k h=\҂k̺q3mƄXTZ҃qF~ř՗?[F&8?3Iڋ~-Q~QP)(9GL ba=TYϚ~}󀪟ZM \j#)41\k}۫~qպ~ޯ?3|pqs%%EssWxw;< sa/afuqhtwiW>3HLz >.*Id1TUT! x3o6ڃ.Cݷդ pHZVne#!0cٿ"l#z:V!˜4= , ?ӵURDdeTaYdJ=Mh+15AT ƂHEffJPEŘt@*(2$!I3K9_ޘ?/mBO*9z~JFʥn2U^xѹF9[v6ruYw1eq(k'xc.(Y dF/LUfSD-QJ 2}~4 ҲbјՑ L[|( y+aм14 c!ޟj˯6Nz}԰1V%\$̊yWMDYY/7=mUBތV]-=G-I=={2.Z[k$d~Y!5TrwNpnނw2IDF}}[Mhe}U"kWcڠ'ΧtB*b'JR%2V(vTLVUL8+ ה,paa#T!hOR)0i% )G|"9#. O9LX[_Y7>tB* fuAw3Z&bǼب:v=..k/s^)J?19緫Y5"U9d۟kJ=&WzDғAl5PL]&S'<c >׳R`왭" iHM"E4i-֫:{TRW0׍[ߌ5A\{L5Va:;(q3G@8B5BA>,>x`dcDh! 
[X{umzF R سg|D=A "m tfyP&a[?;g׊٨]5qw>q>瓟f_&;0+a$mqNF'W[+zCh RuĢr޻/[CB5cWeWf1^R*y5ʈrTcVѷsepGG$טg6-޵(EߊZ!6*Eq+C*Ts?yX]n]/8v>ߖ&§~r_glV_W%e21;A1IoP~)"'#2CHn.CtJe1Q\qEIO/Gk6; ҂!zکH+k*R1W=bq%1Ɓ&,S MU?2YX丷Yi7%weHzܭV=CI X Azr&+z =!>P[36@Y[J1|:P$*OmZBuP*uBPn*6S[t_W$uh}B%︔>u8 N\WS, VpϔYjeJa+4pZ0=5`F`OfYҷvH)aUh7p# "i 0+ X gد#)*M$ ԠR 46( .oPHWT2DrxP<٣rrPV4E.!#9PO=mTHj77($~Q P!69R Ղ!y+x4:}HB>o,{nׯzdR-no`YĦL.|S49MW(܍O>$HIK/v2`#,$AwydTq׷W}M#_swQ?yS-=&һ尐7nI6PzrʻQIR11gn=tYpصwK}n9,䍛hMQ9mVMR4A>wY^d7p.KINRVE 3yr֛?"sa8UeA׍F7_:ݶDtۻzzr&dSw~m[j>{4M ͖H:|#$|>%5_Ŝ o[jx'yBԙ@=<;yB'Ĝ }X>O`lɘa'yBԙ ǒ'@(>O98]>O3 x|y yB'D Bݍ:uD&}Y $S3 O>J\rxCY'."8IY AF?T wڡAwXB.#'u[ c>[qt;>[dw%m}[+ qtve_TJ(_,X}~ 3BɏVX̭Ԣ^j=G^sL@f?/ 5[k跮/`V!rD2HzQ b,*M&tfm DC@$Znȴ̂vOLJ*p*SB=g_x\,؋[b1oi5q %YJt' PW9AFy `4i2L{%TB<1^L 0J<:R`1(oj *nw-ggΊfwŪydo̜Uȿ "(,!0#V*?BYwd^DHW%P+_`ٽVAP=c:]1v|j~xe?b;|'`,ř}kX8D/iXOxˉCΕS9w\y./LM]f!Qnzqf|KP$  n ,z)kkw1\C喲^vmC{j{wwdj^pv&MzOλ؛ O嘦%^1Fw`2*pg:M-#ug%-OI W? 4 u헺9;qsh$(߭|1DƱ7T`Jȿ"'FD[ ^YE&bƝ14&:Mn{)}P\p֡tK[tkmҭkKno<4`NP 0&g"ϘS,AfhSrL{Ӆԛ.tV2J o)P2 )d1$ PRS4k c0!ZBqJT392'<%` yZ OEs# 8w4/rzUf^s ͺ3VQģIM1٬~/*_]\7?u\1u<;P n"U먚?_$; v1DS%A}APv ;`]:^ZsZ)B5?ғ)! 
@k-¯N⼯^/詺έrTum ^'`."m?@ EM֡2v}/` pCsʫ~{B^!+3د 5\ l rz"T *ZCܰ'?_yd-AigH.!/*Z7i Iv<.6Ye58Ul؊P#*63FPsSiL$H,uQ`Pau w7 r6$[ 't*R mrT'T 9q ) ho{7qrx.QShse,M4Ȧ`m&$w tBǨZ}9^ bXȉhMa|z˻QsB111x6H(1^bXȉhMLVq{4Qg @H->:$;s/ ܯbO:ءm m% }c&H/O@yB'D Bv|y`'yB:O18Ǘ'P'yB̙ aǗ'DA'yBԙ88"w|syYW2al[O8`ofjli3oֈ"4VN)goT䮠lAU8M+RN d%³l+hKJ* qTյ RǠ9oj%ZxVB񰝝/jbʖ$1[RA{c0_W*rV<|EJ8xkLOv2_Mf}b ;X ,$DJQǙZi41NNr8q__i,%ek bQe*>NĈˮ+ϧ#!0$ath2 -z0_T\#jlSpuʅdL0ͰT0έHK$[|ZJ.zc7y\@@CDfɐ("a 3aҼ+؝2 Vz 8~@J/K 8MZx.^xrUvqfT/ noB -"ݤ30g/RjwVn+~v_pso6O,oct[V[t(UO8O>jC$jBCt.vxd}ظ6OqkV.A]vPkw^V2V8(rF>P6 edTO>@f ;L0$CMmxtO=c@b c@CD.}4 m$i b㐻\='$u)-E%b.jJ$_,x@˝rJ,M8˓:+KϜڝz)cco9wX%( ^Loa!ъLr j$2қ72ȐF@ @M *Dz6P7Ϗ\^Ðb}a` _zcR*Aݓ ?}M)Anߏjd ('!MQZؾoWd4Ԫۯסfo3@ //@_d +V}缭d.;RX\]o^WtΫGe0 T S% 7I$RD' iyQ"E tuiL&{nhPF&́$#82m$NYHŒ3bLXS_t ؿ\i7.~i5]=kՔX%TekmcVuCyw*mqZۆl 6agؤFסel}O%mv !ZХ̤|w)v6>{cTm17:͑'۞ cXȉhM ;&nz7B1,8Fbot*k#Xȉ6t:֠d<{Vw>cKmֵ28vYE:᐀Qtyz|\+8S{YHQU0R8XnNJˏVq{v P+IP˅ {>V&z:ӹQl'Ce½U ֭]fCULKE"0byQB#nL*@)X" :dn>_XLE8޴.tWEjAlaϾURݗI$;}"Y 2#% @po-l<'ߕ[ims|<IHJe$Ъ,4XTe0Peq,52E?GDx%fs83F%E. -s)KpJZU`Q.*S*PJ*& ЬzDK0 Ve6)BeU)/eAVUT\Z*&Ej,P!Ђz?Ad`:hu(e~,~6\Xme};}cE?o'vm*Z~;k*5"C4`||(Ǐ/M@\q.~0j2WzML̺KJ4C6VC__=m0ѶԞ%BE]NxI1z/-MH. ݕ2.a]ixk:AQ+F,(C~2d7bB;!滺*?^P~ 럷b`UQZںiJK8ʦz'z,^K|~_fy. 
F3)ٍ3xΓշNv;.;wv9M^o`vW[-XJ3_[BLGӁe|ma!nqރj07sSnߒ㡬D] w,4 TԂƢ8hEWt 1gKoAZ"I.zrmN{OSƴD**$ R>}چ\IO9G'pwu&"15v R1/9oj@9+ie1P ocFޒE_`6MbgNq|@$I^ևFyΆ󒙉8J9\E:q2.p^~rz姺^c-iY^\D +"\9F@^PydR MQxvvHOe3ݑ@8'i$YA)ҼdTb#UJA BJ yTQHap& >GxAw2R,@(8;G"Lbgb=;iw=֖UtXYntqkˢn+i_]=Zkxvׁr07f`a%^|s-p+.%D3,ղxbe\멹r/ geV~TeSe ]m-jv7#JӒfVDƈ^+3 rt3زqrj `jF\yh2Sbd*BJЊ5.Lb*EuFVyY (X| ;X|3Զ#mN|BE_I$ Q5} 9'º\{q42"}DWCTb9KK!rUjTMHH 6]qu8P^&DBWGBO 2,ݻaİ5ë'; tƄu7`+5WL1CD v4 ۿ'fNSdS0эM.>T=3?&}ᮇ}aK!`vhoQxra=' Xeap!qO9x 3j|dKڥY'+t}2]N̕殳7&|X֩Cv?v+۬jIg0^wBA#݁]߾,{yQULZ8.+p9qw_l2-TL^euVWJX 5.zpHT 'Q9?1D HŨ,:5 KPX />yf`>ߠ` 8\( l| d%B@;0uN c}hKͩ?R:ֿԂKIG>R W$IafO Ć!v|>lu?dd#px6q` )4/Q:%#SnNl!mЬEtטqoD%'SʖxGTv7SœLJF_lxF9 Hx?$G'}^^(Q# L+E*^0tr$&@s.u EUeA@đ̨5+`RR0FBQS CU-mJ!)95ȝ5s8G`>"P z3}ЯK)ਪ3\B("E0` {CfR ). 3 zmlTQ(2^~POAWZj/qkZª!P<-A#3-wT JVȵ$Y ,Z$Sl]Bf]/7d9m*!M)rf(Ԫ /҅vkM2-Mff?Cp_ϩym?|e5ɋRdBs;%)EP9gUu K#** a!1(7 JVJs^HUReyD% WK&La s%{LcɅKS*y-ͮ7?6"ocCo[]w. cO)I9| E# #xõj1Ɓ0zqo;0~}oӝNv/@)fRcPVp3|:uU>}};фlj~att/)xן>g?lZ2SI:T>qSZ*:U-vKS0Eb4 GX;ӗ 9dʠ> 3_-\I }sE6Hf{I{dLI]F>^/{Ʒ' 0es/WbtA&1\5Mh!&О9d$U Nԍ9MJ0 v G+}{ N^o:Q&Ȁ xIf1I0ʨ"v?Dn۞b>ⴄ .$雰SvZv;i-LM41;Y?;흸L\#CG-LƏQ|H?ESX$T^ P˛>P8!\ IS,ͧv&pD -XwAh\<<$@G"rb'\owy`Y}s]Cw;}u1PwG&{V3r;O!LdVO֟w?ByS=ā`ذE+0X,F@HnW5̥~p@<Y!BdA/~gy3W2c_[~<~ uwH |yv 46ChVs,J^Eq!\t )I_S?@$#H-kGtt_bD cܞZҺ N!*uz'6sfY?DHj$Ix:BRxY/WXBP34X $Y XK0( Ԧmk;VS$0IO_e\=ι;5-qhYIi9}WE#57M=7Ad@sd_|]Ye(1A=5A RAʗ܉Y#mK SMò'6.kE6eahBH\AXP'ݿĚYY\a.־mx9cD׎rS\Z zx>mX׳'|.{i_lL;#!z|zYPF9U 8 $+D_"Ӛ? 
& (EpXd2z1}2rc%j+I7qFe!ss*-Jy%Ls@BQ Q!fcjB L(VVZ .Zg)S -C1  KU Tڥ#UU%!UrJURx} $#VyeL`xFKTdP@kHH*i5QH8@FٓN a/ kWYQڟ2 YJ/E~gg{oW $bD{IncC9FRqߠeUIշ$ a |*~5}cQ"D.|{fLp`hMINK= rخPHr{@ak*ȥЂS91HW#"#5Lj|~v,q0X/+xL`0z?^(]ghS f;ZMt.Ud "KS7E尥)CX0Wq1Rj$AdbZ@Mg|T[{)3Gş2Js$.`F2C42+ 1-q I1)AF{9pl ,Ebo(RqJEۧTƐ͇֨ۡg@h';+/wV?tO x G.΀k@%'cLӉ)pt:1y"BMrwj&~j&^f/G*AlNmutn?ȭcP)TpG{GSŠ$_sAed!aD2%ؼ -6?>5 |IPZ9Sjyt*tl61@ٿGU8>&ػWP/urzvVU+3?Owm͍J=U~ؚRLes^&УD|,$ R;(^DEJ23M6n4`ס uW¿qz,ǻBk~S8neT}F.yUi|wDNT!w,EyrU{/xs"s}&/enin~XȉhWqx7 %GnU11(XÅ(Ti6*4ֻa!'n۔@BŵG߳Z*MSaK%$Y&Ί״י"v1{پAEoIP4TsunZuc~ҹٝuh8'N_4 % 0SW&F{N X: W֢@ټ8ƍsД5mUծ:|!!ɸ7*`e6So}x>GѼ28-4 Y6ƃ\]+~a>Yo+u>D^WP49ӕ+d'zCH>n<|5@6Ұ%JɛdmZC}c^·Q(*`0n5|Qt}t{XB"H}C(;xG̓HB,0³t7`Lίz %緗7hwUwJߖގMs^OP,kB R(Tv֌7xq]4tIHQƺ5YWtM)6 \m%V\Hق55yid.r=m/Ab-@6,yܜtQQB-}eA`_5WYrᴳ@-x.4R^zʆ;3$ix0"+TFH(7Fkz>r؆¢%]`"p׈eNow5-pt1m +)Jq*Gm fS:aNܜn)Rzx:gQܡ"S;4nTaFuk"\w45\T~>6خ%ҍ]$T3ՍxF@:P_%x8k!άth3P)Ѿ?o}E h:B]04ZZAHdȀ9EJ:?P,Nyo-KfxHyC'8¤ɄxiR͙gZMyhlphuֿkT-7ff_ ٳϖ7FsڗVo\m(_cHC~,)p}BSNs+E#jjJ"DRrҪF[ 0h@6d:jm5hD:$X*d8 ֚ At!וּ-{YB mۇ?z Rtz4WAWAhh?!_}훰n߄uno`JqAS}I*Ǝc=HZDJX xy8­@Ž|@B]ޅY;&u޼;ClwXk{/_il7y6ͬDfM}op){f! 47_>0?2L? ׂњo]"# V FD0\탏Cs 8%ϵ#_u¬2b!E),8U6%R B6%h*,N 'L*a漞3F?x~+AroTfoM`$7ȇ2$I̯|]Jpi~$kȻ L[W?c=\ :(rm~|Ӈ `÷x~32WWLH8Au5˽`*хkW" 㺛*[ -GZ(_4PƪDpQCT >a!"4+l= uM'VDĀ`P/AJ Ѱ0oyQzѕ5FPrAhJj;b 8Fei˘zpZ,8cy o7|rozNe6 oQVx/͚??GhD iS9eP z8; =iS(e9݀jt&ٍB[l?ǮZQ-$c4k/8H%T?v u[#x+R?G@['H&?Jj"pˆ$O< 5^I`*OHX0L /::\ݕ#7qVy1lfC%5n JK@[.̈́l e{:6G]cݬ +㭺VvV䲻(TwG]U"ʒ7nrwzɓTB?zvx}݀WT5L 7ĸϸ^YJ^~_fWQiVUtY:\z6P@k/ߨ׭+k܃( )i$mѓBÞp xB5$[_opKf- q]J,ĮK*Yyl˻ɸ7lǕѶs&e7 f Q8(lz!NWW6=m 0@Eb )T( icJ'b0QbpJ`PN{/he ZYҜ$+n&m, qhj" ]Att+I*4%'QNa̦NV{H8 )N6cǕB$I^ )edR=uaUq,Pd^e+XB#Ϊ o0ÙUb1CB%!U)D*%=xd";D0yM-<0 F?ɓyԒr\ .%a,,aS*S"s}J]_&_Ajb;*#&Dr(m_SQ $1]wRkRx 9ft*FG6}]R="f]% 2K "mC|ֳDnb EvE<(Qx[ @b؟R4s8iHg eQƣ.`nj]V/8&%꟫ɗ>v^C}b+zN3b:(NT/P*SĘvz;r._/78Uԝ(Ww+Ѻsq KIYZoǛJ+FJ4sO)|3: y&?v#nQw -]. 
IګGWvtcks1רO|iO UQwLy5U&+_A6Wwgk۟k8.9/?VI'-y|f16olbFd^Kر`0B Z sJOh*<ݴ5FQZR'ז{T5$7㣪+I](x;s]I`cV)֦Y(I 01pF2zRTJf/6>^Lث*%/t$7 wȷQ8$olWՇY e[7 F`D{anKX¶)1N2qI9 Ҥ3ϬrHdphø5lvZD9݊ԒN$T *3J$!贐xMBؚ,MMiy0 ޭ*1Se!Y\شwa 1ViU1L©ҕ)s|v_K^Wz|qv|yWJjuɛSnhF-^M A>N"y$$z!*8‹Q=/ٷ_(96ռJRD ~`ORcI4s#Rsa,_/@36TSE}F/x.z\ IbΌ>ʎa^IӺ] D7"!^Q.J-%DDZ't"N B1rILSB8"u_FҽnjЗF=wym%IXZJƱP&[Pt-Q%Q˪+);$7U|7($weTw@UX ] )V.$mF)PdҔQbRxl@j8`{WU  Xft '-ꠤkNK;t~?(Up}pUTBU@"@lݘhUH u~Me6z1URT&LA@g$ Q/bߌd n3A;T]#шve/ @ 7 yHZ[VEn>rv|ߙa_6lO˫-/ז?{Fr`SE&AM^06gdICFAjRVIWlfXnu:ߩs:ղ5h[lsk%T[nY1C#CTaF:a~U]tG > : ejB`JCrVag~8%3!KOU g?z ^mu>k(-]qHDO|BH)Qө1 90TZeRE ҋ';T1/np*%9U,|opE J#P^v0r:8Br ŃD."Wp%$!RffTJ8θrd9',h G_`aBqI`z&_k/C *AF/4/&=!l(7"=y^ >p^RPD}@JPZoҒ{\1Á"d{wxR2e`젧i4 _y!K=ܑ,Pf6d&K%>s̶[-LjCl\_1a9X>9^.t0_3?_IG7'}xvS MxBNLhZO7,}zrX1ȰV*`9o(OOLӧ'y tN8}Eߙ˗]C?Pf|߯N|Q9u*\0|hҰx\H`[rIerI-@jq,u ؖBGkRojy~7nY=/2!^h7Yd?<%~w{t>@ CSuZVW3_Zޭ\jZܐٴjjKz6rAm5ʫZ%٩C,ZKjȒLdtNQ`\PXa,ZcLQgskH۱vOCđmB֨6!`Y#ԜT,|TANvуcb aͤ]ǒˆc3w*쬃P ܡy ,)g~B]IiRڸ<~m. IB{[ Q׻tuC*٪Fxv?.7jso O">$"{_- #s!ҒlKgli:w-+2+C][M}>pnvS9ፙIJ~SMuGGhV\$qn˩+ {&fp;F#H@RA (TS%/%qG)JTVJ4̀QGiUmYUIFIYaHS3R#U>P*>>MK[U@P^e6 .QUUNR*z23JX'f )ږFT}wѢv,`kFU,ԥt pR"R"wo^pcW(wQm@d$#%yfz(PE:N%Z;0:fhiEԼn>ZP/"{B}#e#Cs caD^ PG48U:+CGN^c@Npc$t/k#s ZMF*g4|jO- 39TSaWDceDk݇SƬvopӣoJ87Բ `z@7R *{ OQ &oq^mRvcwRK9KZ:gs@s?.Mɮ~[c֥__m( a~+II}ܦ(LxMQ+(R SjյA\!h,RȠ N*|Fjh뛾蒄0 ӁraYR}9 *51EC 2хe@EPz.uÆnFwS\נb2@OV ^_1teꬁvf{6NHFq(G%s$ IJ+50KhIu%PU$8RMׁxǡv,>̹x=?Sbܩެw*_)4L#%-#;#{*\;d:GJD`f-YpM$0˃$(x=轳A/yKk*4yAZE:+uYe-YpQePf9 AT}J݇(^"s6r 1OEF8™SjRHKe@}gC*i-1E(c 4yE[' 4TD)4Du7 g&&} P%P6ߧ` 3">)z} w 3RBĝlT=R rƚ")xlbmFUoG ( P-Ot. MJʎn/9UǍi׀nUݻ?#ή0 6:|AԔeی%^F$^g c[c[f Qޠ_-fx?W|K;HMEL17$˿M-i!@"˻re}׫G$ j×O_s9w{9Nܭ!y/e%V~.Hʣ$K؃%߮5bH =>fP ĨZ8 "8p Cy"ju.jӋXE`5(}ᵭ52?^kCGO\/}mA3]: Lգ5>?SAލj ^lӍJ8-YnF 9fZ7+6 zb<ڵzX[OۊOo6BO7_6ɞ}zr1|VR! 
ᅥ1>eQl0B~_je}h /xP06xD"7~56 [_9;7v܇1ѿQ>qm"|rQoLg&dS5Ļn)jǤ]_ {祒`aΖpB`[Mfk@Çln@ N[v}ۧ;aP ^kDO݊pS-|LU~}g))yۙ5\ӎл¤3Պ.BQmjC16<Z0*ap{]K] ^4 qzfܯ۫L۰Xğ>2̓ifw[.ӨpՏZofcdGU6i*n>c@>_ӿ x8{k{=[يR2F6_u ELբ=DAAFv>+aZ8v_敩݆\D{TW6ǹݔYn=ye{~v(Rn s}$2"R=2 D33 ~~sS*K("h_ڪhX0"01%k9Pt;դ| ;yZWN&v6ڤ6>?\ݯ>V w^ )v/vu}t׻O_'Yʺ}MKƴiL(4,H0cb$XRR bm>)kui J$Ie/ܫbA}Gln"DP*7Բy# .BM2 yH"/1KT܂:t:dn-¤ d3]rNa%vgbvR]:eBߥSf=%cX(_4zS6fi)(=VLNG'B]߬LٯzIH͓9_Ip'_Oh R06uGe&rA4t rQHg,DN q+ G®ƢʍU>K&4O ;4q] eןOIC=?;8뇔PONxcbpѤl<6IgWW-'7kdZnRmϢtj?QőwhM<:/݇_vg2m=V7 8se/A$h6+@`(iD-EY(h$thw%?n̛)ca2}Bsmj$[Ň12^j edQ0A ZcJGM{P pFRˠĸ<2`hnXbmL3gUV΋TnT[ySB.D%N !9q6`1E5EE^މ?M%M/4d7\i[L`g%v6 PIH.7u!e|i-,u-sX->Nfv۫`eov|§6Ňy.n&-Dv X?Aovr~wX=xpz㬓~URk]گ--|㯁7I ര)߂ IPrsmc&BO3]h$L7q[Ӱ(hQ|공ozGLĠ@„#9E=`!bnf?/ls|>~/ ;7@[8ZymjQ5~U"N{޽kQEn0#>'l3G` ty1ع{/ZisOy\عX񢣆:x(}V2B&_QӬ6CC]8hxȶn|#NT)۠68%3Y|R4/_Tp\Of{t1`~o/Z3H+/5?ԍd .+^Hѓ.35QBIM0#U]o%ԝz8c\N7astz :SPX@IϨȄ*[H/Y_}1O+z=dYNnN ?kLV*xHt+RcCKPxvu濮(JSY/?ٻyXv~_=ƻ1E.zwUA>fE ;I=uȘa97[-ЊӂAbyQ]PGn~-zj|MmzƷoFWokek-D=kd(ױٮHSB2`3^x\[k iF%Dr66^ 6:vQrI:iOS QHUX8A0m"*r-U!j@vgJ-yX7xk-Q}nz'c)|!ODхr Pց",Py>EXmN=e6rJM$2R7w˹*ȋ>g@3Gnuj,R0z)HSx^xՊ%dM>ņOp 2#,a&U!Y䗻߉/%Rѧ7j a"\fi7G p,1MA_nMUk픎 FOI+ 2H9kDZ_cq ]:{AP'% Z`'蜣ȅD~)%Nm#7Z@I;CH9%ƞ|6\m?Օ V7aWWdtS\OLq!]$- y_y8W@{ & Zlxa9TuKY>[s׆9cFv)Tk@=gfXEŘ VgO6Jc0XIm<؎X$x@I8"JF/AKH4]8,n鼫+sTX23'f8}'Mi5L`H+wI4< ~fQf[e-sTRkJ` w`2 +o 2T` FyR-CT_oS`k_X鳳RLRQy5:1p.CV,sn;'H\( N!1`ӚiBHERSG,3e2NBp.̻٧[Y}2Oz'Ihflt%!}鍅ݘ/ -M4Fp;XK1r\P*˟jɕNK}BeM.1ΆelpSߥ?猥Nrnւܛ|>u:-"[砓.ޗ>HMO"f|H,\訲PCHF9-v\x Avb潺|bp=>zh= kߝyqsR%UN})B= D F[xbgSY^8΍0p@ XUX#e9^8 Nw]O^+έkÜdkK+MoMtbW$/>6oשY$~;Sb5fB0r.

"Jf gٙ+<|Pʕf ۠r:b8MP%itWMHpOIzMkCijdJ 4.}&"IWۏt='\ӒuD6@*u˫njƽck>,gSabL~"6,s04& )v7wkZ W'J,'~q?͖27RaXYfes ʙA͕¬r߇U9O{Ⱦ(lCVYMoGWeyr". N| $EyKPqO_/˧>AP:PFA_Db:9ndT{' 3iԇDW.ڞ@BF'(B(:zYсZ)/c'dNkHSf%IvG6md5-EQV;ECm*eZpc  dUW1""DHcH"- ^y?J)Vyg=0]X|PhFud[;>f[E.)xeԁ=]_Z.q|Rw{$\LQ@]3gޠԠ7VYsX|PF"g}΢O^IU%T *{B<^W RbtIٞ1WaHjBpJЈd~bLA~i QR ";'(p%Z k(Żalv(;"$݁_57q8Zwon>x3]~əRa"K#C/ Xsypxvr̤Dv$gh8J:H/n"Qm!rJD'n9`q EXoS}b śSx.CPG㭥-t@N;}~jyZMFG(u4*pfokgث6]<1t?voMHh-à^r@ XX<^9D6!K9YjX)uvd_9z -ׁɐoz'%W$)ʀZ[_k#nN5RV"P֕Tk&:Nov9wziq{S͉W6 *H8pnuQ}C,Ǡ%oF{"wS2ď!l+WKV!b-K^Ǜ[(Bljԩ G{ 4q@/1ϭӿrlm`eERoG7V6~4snmeb:mQǻ.:8wk0ݺ?)E>|(Ab"‰~(5vB&Z)xMp&ucK$nUYd4D빛ncp}(V8k6ÛW_tb_* Gf'2$vi&@.p68)WX|@)р6\Y@tɹ 9˷F5*~Ja.pϹLR*KCT_oSuhLY)uyRӬ$>k+*JJ byxr6;v>޸ͺL=Ct#ew$J eEsQ[b=w3L9VEzӺdw$ TS|N1@qn;-K%xrG.$%Cs1+ܼ xա7kъFŹZ1}Ah&HP:UC:ghqILfZX>IÞTM-Q01eʌW!~CKuf)W 3r.10/Pxp$IwyF 8$aS l|[EYd˵mF4Z[wD)wyMZwa7\X$oDM|`t[ I@Saãe> -+#GE;j^AF4O]`{40xڲuؒ|t/oJR%Ĭ,R̸ _y60g`졿ʓ~Nu'dB5^R;gD?ʅ:E`#(/ZԾ\O}6ԟ+ #ŷFwR?on^L U1BHWt_Ց<~}v|"Q3Ob]}+TOYbTrS>B)$PW4Lb/kFG37Y?d|e{&7$$mG2K&NJ*Oru*C'W^cg/ Jr amdśbsޚn;<&$Nɟ)FU9h+i!$KtnL-w<AcI3ZlR骺 ]b-x!i IF-Qemt% ^Xȋk'ᕑ?Axu#(5JmlE45^Kۧ{\`^zaLťŪ$IJmrk,Y\`݆b<*Ḙ_l$MmRXs|Y9v4ZvJH$f̱wF ^^?@ryS|Xe^`;7,gf'W]>4^B]x~89<'y*8Tj|wCOF}qßo`YE^c4iM#f!vv.i' `jnw_w¥]銊36h-rm#wHBz~k }[Aг&sYtyN?S&nxǏffrG\^KIF  fV T_$QW"" a{ mDG7* SJb}h)*',bU4A#CNZE؎j pF+rd1lDmBsM sJ`AIBu'7mK"s:F̕|}V咞IhV!DŸ.헫׷m9׷MȬȘ*EfF~\bnu) +kSڅP">sa4G =t}YtRKP.˫\.6_g_U?W}?{=B8>w@њ9D ޓT<IPKhdܢ8J,:U~~_g?pj7[Y9Gc_ug(7߿\wйz׻/I9yyM^PgY:Û.xj$@`r2SyleR܇~N5oase# x o)׏_Dͳǫz#joaOi="4뷯/2)t~\ (r9įj5y vv_D/.]?7u/ /<<ƹG'op$ m5"fxVG*gg&ViQaG |]6KQ@dEŦe +ԟ<׺;Yw!i30̷yld6t}|M;ּcE8G?4D#@^7E!Z޸\wp:eZ_&L fF qNœ2 ޑhMg(AG:#e  R'Cj!Zy'(7E@|i0O0u.s֙z?Aa*}`^'99P]tVkyr'C MPGnג$ SؗEჽ\wJC)_6jkUa6Y2 IAN6GŅQ֞\LFJ5rmZ@4\V$셕>D5J#Y6]'9~ST/8.:c)DiJ>a! 
!Xf^gtqИ!L➥kj.$A0Iv>:{qca"h$YSgB:Y?|xʻa͔A][?/ξcEVu))v vu T_߷sT{G 3RVI!5>fMmTFFNK~W}ktcT?a&P&ػ8B,陻\hލ⾟bFBd򨧝 +&|mnG0q"[1eԩ$[t"ّ;AbRD2v"(]jZw,[岄9[HrX  K4e V\Od$RH8#['̜ c}\XNK~ %S6c.@ keBu8U6 Ն!TW BZ%j 8i dJ FR[Fx@F΃Դ,||{@V<=h ґ2&`Ed}]Em2R1xļq-+3-@@{*R2jbz2Inc=?PW; Kj AXP[;P&# 䵹ª<j HY%I+Iޔ[KI{R.=l.vs|+QxNZwE>䎼(RB,E۹6ron}iUMht@ggxi𚔹?bnXUvU~pi;š-7QQ'5f@5jgg.d>,{3igi"5D G9C+p~~uy\qα&1G3H\?{WǍkHlzmvW_$oZ32+c>Hyŷ7]^$ ܚPJ.ѺA j(KgR0q 0FX.sOC>JzttKSD#_8#QK`cZ_]7FƔhԄp,uLYLj7 AknecݑRR8}Žc+3,E\zc]4EsCܫEB\KJ McJ f1^Ч}o!tQcQP/F(ݙUS !Dlx՗Uo}?hwL"Q'n^+>3^nz]q6u=im7L:]-&쯩mŵ&tpܶ=h(WQuWyvWW@H<.$/TbNW5N' qLRl $`tlRpK0&8?l )gzoO~@O =Co Ci0pF"d/<Ǯ^io0~N>+spQ!cc=/h>ƪ;:H.c{rlX'/@t2œuj;N-k'}jPbr+ ,#Ǻ}/Njp nDkۑ.-rpO?ˆLj1~LK9Xr< xBO.]ˁ~~6Gf*,lصeֱV6>u3N:`|v9nIpkƨ,|vʑA%¸dɴ,䂚AWfc*.Ї.&5OLrvvF-+/3)5ZGyEd՗0g -dWr9IhZ| 2=GpvF)ol K~ FCmJL5IOR^{ڝ_ė9|qpƢq]v|tAl 7Ɨ3yyxh`x4HE5ݚ9YSFQL!5%SsZ-y0d(!I] & m#[6~N}eqsmeZE9s~Is.hgV(XmX32w*)ɝP0Gjurح2gYiIV(X-7gѬ``*6w.]X"My9e&tb(v,8/*F8WXjP^KK+)qs!؎]pl;ɜ{a ~V1#n^34|Ă7I}[Ҭw%7x^QޒYZ J#JWX.^T<4ޝAWi>tC"ط=3M|}1Ha9f(41~9r#z"t?/n+g)x4j\.aF5^*Cmw[]56,mw?@[U?VU}9^E*$W_;7%@x`J Bz{0=r#1mx NA5VK%vgSNDŤn&iamDr dRSIiB?mx$e3qܕtQ[P@#26ǓLj[&C1e n-1>˔9\~ƿRuQw0;d"Zfn*KLlYbTvRǗKmh|a( FBbka!jbPJ(bv7t$˪,o?Wr% :C\dU ; p,9PIbw%;%!f>?_̭L1+v0.pIXpuN͹r,>糱gM 6ПB?e36݂ wyjxY <u4sLN+vfيevu$;6l?$]`\/0?pQ<\`T}^.0Z5VMSҤNm|~~hMאNP[ - qluɞW- 3t0n#5NȲ&Cb{{&C]j[.idH^b~>&Cͧ~/:!]ʞCív 0thՅݙU'2P^ORZEid@1Q:ސŻ+1`I=,3bEz Sn ڭn+!t#>[FqrBpۂip)u m5'rL:,v0.e1ƌ8CpB_fN5󱆆0z6^`F4/!`Ƃ':v @йJC`N~?jA6/O| }Fyn k= :D2Lvhr' 3v'riÐ|of1fLPcN?+>4S*Ҝku4omut4:zh a9`Rn<'<)2|]{Or ch;6ђ6:3SD1>Tric<_yǮΌkwtJd7YǖՊ#jF83Rgh`'8cYLS ΃ZGJ\уC/6ytQ\;JcsQd6fh+hrv}%uCtD݆p-j^PK 5Vh~f;׮i.8'N˨ͽ7򍽥f)K$d6+),4uZ8E \jfDy\t;IH o;|a-c,}dCUAum AR|} LV. 
k,)cb^lj.znw*I i'jW~Ѽ.WǙ)H6-}v-kHWiI{w$:+➾'}o;Ls: 붲*Wέͺmk {yR_%DݎJ$ 0f_Ldx 0e]dB.0pryTlz> 8ɫDƑ9?(㛳(8e9h11FOC ^{qШ`ohkbm4Iة iTm pVHcLb4"=d؀k|MTs D FkHK :xr^ 2[l#ĸO !sr[ F!9)rXM|͹a XU?1 +F09ؕ_+wg#>}U3WЀ,G0F8x[Do'IʂMͱ}ď?r'\yoIq3{4MKkCՁ_Oj%u)õ{1~xԔVA~k;׮ j]{A3gz}ȶdSH32UX""PaR ڣ;#ѩ)L_BΙPR1DFly }8Laإo&c / $AҡPs!$e~mX*Hz2=g@R#~Hʙ5qΔ~,`U 5v<;%81D҈ )%?]CX.L8o$y/X2gzW 9xvbN<c׊&u:._"xdo~@}N%pXyM8+Ui/)A(T)ɕ$ u/ $_V:yXsr9X Mw-G;9B9Sgl(%N67)3%)5/71L4ڝ0;8ClB:әFxK2]&\NH;JBr=_NpNIɀLu)(G_ӝХԈVt!5;D |͟PNQ!$tqwY(UtaCUϣg?ANDfdπ)L鈽;ϝnyI$QràF$gV1#+c)+fK˄ł1QQZy)G$?lF=KLcFKe,)JD@ axeFquńBYɰDh+5k(٬nI`&'̈=xrܿ'v(%@e)՜sE6\sm\ÎqA)HmYc2PypmPɘT ^[GZuFK6A5Dq[{G3y5Lj YT! HEe^9CTJ\(&@fLsD%{im>2?m lV5g=2m?z9:nf;l1E&ݧoEFQ!-'g*s=d ifuuȯrSjʿC^+=7g@Qm@, J6%b9X>nK#QL0'esrm(a1qՔ3C 4AA( S@4be.r-q@a4Նq!:|fLuCs\Bڱ6T(0)Ѩȑ%PN:jhtE5f4`#S-/2B`UȂV 9`\U :j.if?^ka@VRWEQ*Ua zGi Հ-aVQ AGe{ѥ/ /ġM67ÿ#؏ϑ4R1NGz͍H2Ͻ;d&7/1R88Zy*?!bTGYE' ةb>K93E,8g) $ ta& 8RjWl6R$?$YHT 5a(oBt<.g(5_  hN,8E1Yyd⑖t5 #%]c$!]]B&0=uS8}ֽAx|7fATkSB{ŷ?qNz j% M%Ի $tL}? ih#2rDbrz׉z1n'"rR»:1=%v.+`;=&>&e~ DC|3IJ藢#|+LHR1ʸͧ'C8QوI#Nt,])'zhHZ, ]Ζ hzg. Yg k4C6rM\K%2(̧ H_'lYܯ*mVS 6hlnPD%ж0׏_޺*7w?ml2%[ENj8 pcGY=jcno`͋z(اOܒXv XP#b?}\*F1 ,H Jo9p#y! f= s3G߇]G&}|OYiF˲<*u5ZK &9G+MT0íEN|OGV4nVԹDke-EAuQTX*ʂtP)_Ž3ҏ-1xv&q۩T wGN>!@Y;8!A0kuҝ۬㯡9RkOkk ,ޮ>;>8+z#YFe(c Y.H9sefuuȯr<j=žPx xEzcYGU앉9{bG`|a(2TFPùʡ9 @0)=`b\4l63yR=ʈPT @1,`GLI"d8&-,lfJ2E+)D*A fF/ ʇ:xХ9rE8E wn5O-IAch_hd}絯ZN s<}Fԛ uׇ6Wt[LP&'+ H%&* NSH؂|ġ׍ua$LRM& H_մjӂq5R=䬦&TqUd@\ \t.IJhY1_z1c9bI!O0lA48g$vs|dܫ)c2 &1*(giynK* `s*•1VT Q9G1ط^4oyAFcd!uּTМrDeahY2̉L*%C* v{{)vi ?LNcI:N1]AyTs5Ǿua< bixqJ4VԷ}2FDs*T cm2DAa:s#[ݘv}wm {'Q>uet^s0Ņ\`I>o6A%RbJfi #*-UjIbtϾ}ǃ ĆVC1C; ׷G{LngV.V~0"RizaG1#lxSq"L2 thP1VbB>δ@A}m3ɿiI!NQ5'U\b7PczPS*Wtlz7֣'0̩ZZ;.7@x٬+ :*|J[y3bU9lZjVՇ*3gzqFh5?;^7?+Gn2Zj~Qꁭڋe}\( Er'sG\Xļ}N†, ~"B}Yh;ؐlaKպpWRVRqz^OBͪ/7drK=!$c-[aZ6'v+rOm"÷ɥ1,P!!_Ȕ8v|td)mHmhdmPۭ E&S:Ph mAZzNzqj/=O]^Qädywsy)_M'Wޣ䅺Z6yn&X**΄TPmE]hol! 
y?m׍$ Gzi)ZM&T%|bXcXjǫ  [*;Dla3C佡i to`xXH":e~ML *Y1dJV /Y1C %^^lzōy>kp>q9wpwR>qD5gp3w6{i_/t }`'wCq2B\ 9!IK}/;ڇ]#A}k3 } ZO7,ٵs$29Cu+Ey&mrQ'3)R>.9@nm Pj#5v9~Ts6?-6E-2=eY5"z{5d {yw~jSR=ǧ7B8%Tok?٪@Y8ח[V?%W.Wyo}YdO3Ȳ%i||/l/aUX3:f)E!H,rCKpBOźbpq?TKZ >W(vJ%($UTЀفboLRТ)Z-U]mn+Ė QMZȧήlr&ѐH/[Ux=M+ {FY\^m~%DizC`Cqt?~% hPtM}_ ZCng duY SGj"aϋmXopa0 Pm>'A%tyaځlV <l@tǽ϶~ n# 7WQR.vI5Udo8oH[dA$N2uvAeVP_/=4wꊂ^keUۗ&bi/G@[ Χ=-|!B9f(eo"<(5{|=W-U3vtɲClR#qYqqT888GPgzEj'-˲(,"s4X@,x5,U%, &A1e?aϓi9OȏOn> =9SGCj ݝIpʷsM7t_^]nH*"\h|*K?Η1DݩCvO2Ι&S.~l.d$ul$;p/9N(Uܙ;yǺ_+Wj!;zQAGp(8ÛK}8lo(;ygFbPgVG4wͺً^ZI'krQzрOcur_F|zy*'d [DfIbL3.D;9 $E3+.~@q}?)l]}ŪS'G VUweE&4Pr(^5 ֲ&)Av`aIR4NhBqd8bL*29|Ծ&) YXw=d" !hjGm" ozk~r,jnE+}< qW5AT&hB%nҊKz5^Kq:0@ǍA{qZ\1Bz]GT]*mSIph}'H͋gQs`[a?|wE $X1'6#b29QQ"-Q!.'䇻<J:Ƹ?⮎!b?ԉR [ ]eDsfc͵OwTF:n={!9HsBZkRF8Z|u8QTWR$RiG= @OIC.s&gZi&e1 u[ImBFP8hv2(,8ByXmF'Aά,Ӕ(h5FRSN[]ъ]ܭʵx1R@u2zaz'fkrp?gmRg%K/5qh(k!!wo$!GcD Qc^ 0YiN?Tg֛gw"g׌ոjV3!sfN/fPLxL`!d>w S4c=:L)hC|x?2-h:\c(ܕi|ƴ)J-ׯK b߱DvܢMr0ZcZ9SﭲVe]9 o9ғ3?e2֒|"%Sz; AhHN;iY_ʉsZ/\DdJ |4SCn<w4nnEm3w6']iq0{;ޱ'X_L 1̛9k H͎jLo.8m@6\Gxwyηea(yx1Q'%t9B4;_kvVv^( 9owax-J%mPZ&vj1۷k^l#K Zøxb&m>OjNqKt>:*ϧP/5 ٰso9>%\ %Hw-˕]o!$M!,avIKe:cYDj-Čn[FcqeI:Լ3ۢIrQeZSvhRT:t-UCVQ6㘴 ^t!UC؇B(U/j5Z>h:^8sP䥽 P޹?{wM-->BL(6b7OunYBHq[J2rR\QhÓ+d]ҤY6 ą"1Wo.DDhRTEx;b@d2~,NEJrۃ˿ba#)'t87ݠ|ZcE)/y 66\v:w/@VRG#a!~OWn[R J#&7(gEJ/ڗ7HO"HoW,`e ќ.Vl)RY ׸i2do#3H^lJL:2U-beag beӠƲъ/[o2-џ>|A8CYKNW`dhCHT'Mai $, i${Ƹ \q t(V!z(c-=6GaEI '768PË* AzdL3Peb9eaZZS噰6̧P#CDvmG)eX4d7eD:딡 g>CY+DZ<5!B^!X2)hC=Mv+0< q9c#eOA Z+I!jK0߄[pD3Kzbu Af4XH,X-T+>Z0#\(i |#qxn4[`^mc>t"LIey1u)xpJ72KӌKO cd#kq*UX˴2d-po[ف6wILwhl$qx;]k#?%w]j|0")͙_Gq0?wYyn,Q:sJJ!ݓ±5DK0̸dlĖ%AmXO|x,B El-A)E!Z,`,A9l~bV뽌^?iESgÎag Z] `3=g/9Hp8(,#߽@vVkOΠ\k}abeϥB{{7y:'8v3je9E<5gC6;(&d;AR㨮xvV^a?kMiR:B BSJST_R)݊g.@X%ܤ )$lx0t]&k>yثGjl]1j0L-~kVG VvL@3B[N$ aeIA{䵗]̮w&U,F;X D9`V_DX+>gEg#ނp:]%Kujd!0C3gO c-iqFo1m"M1xp۱Ht ^O^v}%^ ;7"q0P85Q*9 ļ &M{Rj$3Be{hq̓o9ܢ 33J ¸Ƃ>}Q$8(5l<*N+)%}HKV8o'P$ZJ\H[=C91"HNxO/ 1I"TP9"SpuHN!DhE,1f%5^eR{5>%FzEUsf^AӸw(6+@x~ |\4ƚ 
8R6$Z NF#QCkZ@:c{ ߃__8`O# {"kmz3#trU|z;YW]eY29m\vp)d!2'XpQeT{O{43"I 4. -I~s-Q;1js+%7SgV,9i I#h)22ỏyQ'V^ Z9L eT`z04*07 ˰ 4 67wHGPWXͼ\`c5X[Xx>8@xM=X)”fga`"Ԡ:l Ÿrh=Kp.M9f\Jp~uR)ʙTv)y@mK"e]RXN3K@?8B.)PZ;%N舅ˎK%eH%գ.)0^+ r$&Z {>>kr/k|wi`8NTpeמVcA3 `D2Z7jvNWe xC2tb;շl VJU80R"N"j2%I3$PXiDa]pk'U"T(߲W.dH*`6.̉G2ҁK!R+X֘ź`K¥Mif.Ep%9>TɥFҗ2>Ļ.E5|FGBrUrpߚBM ,&"'hZ{z,爈X23>s .O^, ѝģn$yO)xOkzY˼dĤI.5!Κ=f< lI.^z|B+ܢut^AwG8gv vTpʙKڍg{ў߉_I $|y/8.Tlq}t_ӛnqnQu[H_jʐh6 {|#)P׌iQ %m#ZߩzНgOOyϏ$:K,i -JT٨=j;'Rt.T<tsBuЄ>tsBt=tsBvn(^~VM>}ѥJg*2:W{s%@?ً0t8i09TT֫u"Ool^ba`ܗqk5Em.u\:tfʜ,.sZnK'j?|VtY.u*HStVCkZk5E#{>^b H/#Dc ),æ%SQBs8cylG"2e_{LA60nК~:[M ƹe cugWgr<{),~J I%h޸k3%*iPԩ/nK"U#EI;7^hlZENSAw0}ęr4.myGru, u$CƓ-v`[-%n@|N]pi֪ e|5E5J1 n*&&n V#Iq6qy5_k0lss<iL. ./̓ &zO/liFL I*' YǺ)p b-a}!1vrMW/9x ;z9 yeBSjܡƯiաUz@iuWct&7zԈr V.(_ԡE i.innTSx;/3(+D[vP0woX BH:\ϵ.5Zy়_zkŬrl;'TH~Z 0[Bk0{sQ݃Ƃ с Mxw\ g;jjԟWqvo [촋};httEAy'gM(vN郰;Y1?c00dIIpi9% 9,ֿ dD)U|໑2ڮ8n[K P-}D^?v(KKfO[IXe"!` AY[+*ei.;U9??T޲3hm r,Gr(mtLBZ!IF xJjb|j= H9yW(R|%/:?B[Y.fYYs?1盻ˀyRӺ vme;lVGHSOWM.ֲv.lwN<K X3>r8U/<;DclJVIǺnVzMѻbb:c4ngf$cݢ'ݺDlG&b111xs"r0=Pօ|&kSGhc n挷Ǜc?{\x_4F=Ƿon54 }o7c'Xb"7_(l=I}_c~u=x K ~dpP#ݲ7x;Ўh1TY$5,gB#fj(2*fovcͷR"ڒKW8i KJ L411!,@b%YMsǨM~H HKn)hl8#qи=T嗧=ܚ"e.aeXȜ"HW%.}|z_%0zY{$|;a]ln-JZ7\Y3MJH%٩JZd)QIԹTRuFDsD;NU9W*cs%[XR(\MLbZ*oνtϹt3 Qe/V1ŴGJٜkVdw;k۷RUIn ;OyZXOWGI~_E gz6SRW&͇?$?G 9k#"kxJ'Ě]D̷U*\,FiZ`߻4|o$#D gq#sIXl8N[$"ϩ Ь) Wo!9 sq^}4k窣v:b asW֫nZqX]x2d.ҕ]{>{o}jު9lRnp&yN&C9\c$7cCLJ}2ܰT?dDުEVf8Р)P5hл;0]0Cq$)IDB !pJ|m?- γ,CN<ÖK'ٌ\LžӦXRrjSDݔvIpH' gG2B2|*ZsRw²XyhVB&pQ9x==՛D0׸I 84tfՊ}Xd5c*+8G.@|+NJ՚/,-qfZT6Zmj=9%@LWI>^m=$Ἳ@ZtL94d C(csBDT;zɫ ܖqR*Z~;xFR(Ju%Ejh4pK7xVzLԘ*yҋRìqo×.5f0jh<0(/ ,#&(4,Fjw RuZ<ʩeiF9"fV1KsA"64` ?v`y^ZJERץƂu!aVJI&5[l++J>Rf5Np+UV&juJhJ2H-D4Yٮf"%I93S 1uƸLL -|k´@"e6@uӆ} &zϝp&?OKf3ؖʾDxRt;bQb!]|XtLcSܢڥ1i:\8IV" ֛.p“,&Iaq.qFH}5}bK+73C ȵ@`L\ 1j9 [ Bf5# 6!l7KhƱg%Zm{G{Q^=x_N8ja9ֱ8V~ipfJ J[i44\ !/H٤MW6FI/Z^1C'M|Ѫr,Xڿǩ^`MgCi)ȃ }-b( /&_}A bZ1,#XkEF9܅D*hj#)BrDº:Ej w'[uܛKf jw2?@JY`Ԙb}J1՜ Ԥ%owL4NVN4$va ND4,WvatDbCCjBdwEG9>og1 
%=)B3\i/r MACa5z-2ysQ~$pXea&Ky"&q q(d65ԣ]Zc{|_j AQlu2 KAʦʈԥQ 10.38܃ v4pL0у}6@!>+A,oX`~݄S$;_32s%4ðm־,89)ZʈX S9S"DI~()47pH')EkeT xgbY{%\^ob0{zå s.]šRN,7+zNO=D-@>,M-ᜡK ^vW#F VYA}pXge!j[Nç4Hփ@%1'3\`b%b N̙\+JULlD.!NQ,+o"5u99Qe Ŕ%L&B$ixHH}sBvN`w8pρJTQd0 {q]Hڮ,Kdɐ@OH SP< +=&}]jM\k^@+E̓p6#%zl̈́cQ0Tα;4֋y;BqaEw&8">$ip9gQ҃LfoNY`ja++)p\1E3XOL|_ P|z/F鐻ֹ}; 4[hnRBy._ߟi\g kf #E$)1UL{m{y4 \deس0%L̟Rh._ i "d_n5a.Q_G}EںJn"Af;a06~{`ݨ닡*Ђ˓&WyKދXr EFַ[Edgԓ":=DtumD/\7& \0pz_ pGX 0uGmUR\zvPNz2,ƹ" 64=70ˑU?:]O{ )KÝvi"smr3R q0KJ:a8#qdJ6rzn>$u9kEg1fLQZJI}_z J/Jy`>,g.5{Pl9,eBʹuSFxFG[ք%2wy+%s+ ڤ,LJaޔ'_4,1p zS◬#3٧җ~^"\VxXB8j &g~kYYoǶڿE?m F( /- vv?~凕@2.f,EN~q+| ?7K[WFF߾uNwWmN޸*P@K^ua=/Rl?8OV2b{!%X_/0z~0_E/n'\> .̳a ޹}jMXJ63,#!axЯvP=ǕBzYZ/pn g/q9(sz+TPtӟ, J=+D KkbB,|+c&4JN:l[Q!_TJYU''FipMuY9D=v~+1o{b!t..Q#)WDmmolYe5QoTEF{5V$::vhCQPsHDGmu=}2",K*d(ʣt.E>ڀPq?|T3kV[1qDV"cSqƹ6E^?qʘUTA{͵[[a2LJ=',hAg $p2dzU\>=oqseO7#3T ͸뻛I@iϮƀXϋ/']qۤ\!ɆH$EfE;]6ڪo,vt]j-G/Vbh$baliٞP`ŴظdPE;6}# 1|1wSB2,ژN+w7:)𜆷mV)1 LǏ"5MTJhdEXڀZSͥ,[0zzqi5h:|Gͥ\l8CXNJaa&,f`'&զ|Rkh*h`kSLa8~0a&e&aÕILꅢHd;^a󺭉o] E>XQ'G#J8Ή6цji(+ 0%&텗DzÓ1lNXeylJqΐjXae)\FESa0WoDd"J7J"ZʼnƪDe``y\vY&x-B9)Sk09p+. ǏVsh$@sAT?y7B6@ZJ}vti}ջ71 BGbtt1 etؐVBkQ j=6R'6rgDk-zH8ƫk8EXȥJ2^;qq0X[XDbYq,q;*0Lb \P=013lexJ=Z׷zR C)N6sIDC( s꩒y8qޅCP_CMKG}L"u`HQpi DbjEQ)qH%"fVk9@6Kr%%ҤʙM R-#ø!ڸkBcv;2_L{oV%oZaaZUs&ڌ>*Ca!Ra똂( TW 6䷽WU]=r"L Kl%Q 2Y2iTBaS:;Vų3KnyP7h.ɻYZ&VRzn$pQdO,MFSmI=P;IQ@ӶÌNj*(2Ny& ;(Q8y%0Ɣ=Lq™z@Ͱ4\KfHlrx5Y* 2RI-X2|*R2vDV(mAUq k̯MPH%y(pg)`C f̦aJ(%sřC*c!KjEPGT띐;{u8+G]`U-~|fFyPn L &P=b"9RBJId0V4ͬCIhJlJ bH,UzC,x/|୅sD! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 10 09:03:56 crc kubenswrapper[4883]: body: Mar 10 09:03:56 crc kubenswrapper[4883]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:44.89144587 +0000 UTC m=+11.146343769,LastTimestamp:2026-03-10 09:03:44.89144587 +0000 UTC m=+11.146343769,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:03:56 crc kubenswrapper[4883]: > Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.445672 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f749a0809ba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:44.891529658 +0000 UTC m=+11.146427557,LastTimestamp:2026-03-10 09:03:44.891529658 +0000 UTC m=+11.146427557,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.448677 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 09:03:56 crc kubenswrapper[4883]: &Event{ObjectMeta:{kube-apiserver-crc.189b6f74e40d3446 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 09:03:56 crc kubenswrapper[4883]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 09:03:56 crc 
kubenswrapper[4883]: Mar 10 09:03:56 crc kubenswrapper[4883]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:46.133382214 +0000 UTC m=+12.388280103,LastTimestamp:2026-03-10 09:03:46.133382214 +0000 UTC m=+12.388280103,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:03:56 crc kubenswrapper[4883]: > Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.451810 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f74e40db403 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:46.133414915 +0000 UTC m=+12.388312804,LastTimestamp:2026-03-10 09:03:46.133414915 +0000 UTC m=+12.388312804,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.454683 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b6f74e40d3446\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 09:03:56 crc kubenswrapper[4883]: &Event{ObjectMeta:{kube-apiserver-crc.189b6f74e40d3446 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 09:03:56 crc kubenswrapper[4883]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 09:03:56 crc kubenswrapper[4883]: Mar 10 09:03:56 crc kubenswrapper[4883]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:46.133382214 +0000 UTC m=+12.388280103,LastTimestamp:2026-03-10 09:03:46.13835916 +0000 UTC m=+12.393257049,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:03:56 crc kubenswrapper[4883]: > Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.457314 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b6f74e40db403\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f74e40db403 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:46.133414915 +0000 UTC m=+12.388312804,LastTimestamp:2026-03-10 09:03:46.138385008 +0000 UTC m=+12.393282897,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.460557 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b6f72764cefaf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f72764cefaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.702130607 +0000 UTC m=+1.957028495,LastTimestamp:2026-03-10 09:03:47.152219812 +0000 UTC m=+13.407117701,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.463582 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b6f72803847c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f72803847c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.868549065 +0000 UTC m=+2.123446944,LastTimestamp:2026-03-10 09:03:47.265834405 +0000 UTC m=+13.520732295,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.466575 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b6f7280919383\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f7280919383 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.874401155 +0000 UTC m=+2.129299045,LastTimestamp:2026-03-10 09:03:47.271662585 +0000 UTC m=+13.526560474,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.470675 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 10 09:03:56 crc kubenswrapper[4883]: &Event{ObjectMeta:{kube-controller-manager-crc.189b6f76ee1e44d1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 09:03:56 crc kubenswrapper[4883]: body: Mar 10 09:03:56 crc kubenswrapper[4883]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:54.892207313 +0000 UTC m=+21.147105203,LastTimestamp:2026-03-10 09:03:54.892207313 +0000 UTC m=+21.147105203,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:03:56 crc kubenswrapper[4883]: > Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.473510 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f76ee20538d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:54.892342157 +0000 UTC m=+21.147240046,LastTimestamp:2026-03-10 09:03:54.892342157 +0000 UTC m=+21.147240046,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.044513 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.372673 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.372864 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.373838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.373885 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.373898 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.374353 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f" Mar 10 09:03:57 crc kubenswrapper[4883]: E0310 09:03:57.374531 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:03:58 crc kubenswrapper[4883]: I0310 09:03:58.043595 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:58 crc kubenswrapper[4883]: W0310 09:03:58.075390 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:58 crc kubenswrapper[4883]: E0310 09:03:58.075461 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 09:03:58 crc kubenswrapper[4883]: I0310 09:03:58.369757 4883 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 09:03:58 crc kubenswrapper[4883]: I0310 09:03:58.381400 4883 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 09:03:59 crc kubenswrapper[4883]: I0310 09:03:59.043965 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:59 crc kubenswrapper[4883]: W0310 09:03:59.314264 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: 
nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 09:03:59 crc kubenswrapper[4883]: E0310 09:03:59.314321 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 09:04:00 crc kubenswrapper[4883]: I0310 09:04:00.043298 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:01 crc kubenswrapper[4883]: I0310 09:04:01.043427 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:01 crc kubenswrapper[4883]: W0310 09:04:01.273431 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 09:04:01 crc kubenswrapper[4883]: E0310 09:04:01.273514 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.043397 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.177579 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.177737 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.178669 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.178710 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.178719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.181070 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.189746 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.190546 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.190581 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.190594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:02 crc kubenswrapper[4883]: W0310 09:04:02.705874 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User 
"system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 10 09:04:02 crc kubenswrapper[4883]: E0310 09:04:02.705926 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 09:04:02 crc kubenswrapper[4883]: E0310 09:04:02.736238 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.744101 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.745230 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.745291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.745306 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.745338 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:04:02 crc kubenswrapper[4883]: E0310 09:04:02.749515 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 09:04:03 crc kubenswrapper[4883]: I0310 09:04:03.050538 4883 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:04 crc kubenswrapper[4883]: I0310 09:04:04.044341 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:04 crc kubenswrapper[4883]: E0310 09:04:04.134655 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:04:05 crc kubenswrapper[4883]: I0310 09:04:05.041387 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:06 crc kubenswrapper[4883]: I0310 09:04:06.044197 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:07 crc kubenswrapper[4883]: I0310 09:04:07.044318 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:08 crc kubenswrapper[4883]: I0310 09:04:08.044825 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 
09:04:09.043867 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:09 crc kubenswrapper[4883]: E0310 09:04:09.739990 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.750225 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.751390 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.751435 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.751449 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.751498 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:04:09 crc kubenswrapper[4883]: E0310 09:04:09.754554 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 09:04:10 crc kubenswrapper[4883]: I0310 09:04:10.043748 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:11 crc 
kubenswrapper[4883]: I0310 09:04:11.043362 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.079868 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.080787 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.080815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.080825 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.081250 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f" Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.214034 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.215352 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"} Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.215463 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.216089 4883 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.216109 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.216117 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.043726 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.218704 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.219189 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.220804 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13" exitCode=255 Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.220839 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"} Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.220871 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f" Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.221075 4883 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.226350 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.226396 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.226407 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.227043 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13" Mar 10 09:04:12 crc kubenswrapper[4883]: E0310 09:04:12.227915 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:13 crc kubenswrapper[4883]: I0310 09:04:13.041777 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:13 crc kubenswrapper[4883]: I0310 09:04:13.225906 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 09:04:14 crc kubenswrapper[4883]: I0310 09:04:14.043654 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:14 crc kubenswrapper[4883]: E0310 09:04:14.134783 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:04:15 crc kubenswrapper[4883]: I0310 09:04:15.043531 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:15 crc kubenswrapper[4883]: W0310 09:04:15.691990 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 09:04:15 crc kubenswrapper[4883]: E0310 09:04:15.692042 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 09:04:15 crc kubenswrapper[4883]: W0310 09:04:15.780808 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 09:04:15 crc kubenswrapper[4883]: E0310 09:04:15.780839 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.043831 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:16 crc kubenswrapper[4883]: E0310 09:04:16.743543 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.755680 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.756859 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.756903 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.756914 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.756940 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:04:16 crc kubenswrapper[4883]: E0310 09:04:16.760057 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.043612 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.373177 4883 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.373366 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.374253 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.374285 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.374295 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.374714 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13" Mar 10 09:04:17 crc kubenswrapper[4883]: E0310 09:04:17.374863 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:17 crc kubenswrapper[4883]: W0310 09:04:17.684528 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:17 crc kubenswrapper[4883]: E0310 09:04:17.684575 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" 
cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.043652 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.514503 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.514651 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.515778 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.515819 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.515831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.516383 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13" Mar 10 09:04:18 crc kubenswrapper[4883]: E0310 09:04:18.516594 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:19 crc kubenswrapper[4883]: I0310 09:04:19.047680 4883 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:20 crc kubenswrapper[4883]: I0310 09:04:20.044408 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:21 crc kubenswrapper[4883]: I0310 09:04:21.044177 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:21 crc kubenswrapper[4883]: W0310 09:04:21.518528 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 10 09:04:21 crc kubenswrapper[4883]: E0310 09:04:21.518579 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 09:04:22 crc kubenswrapper[4883]: I0310 09:04:22.043579 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.043314 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:23 crc kubenswrapper[4883]: E0310 09:04:23.746448 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.760694 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.761734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.761775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.761786 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.761811 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:04:23 crc kubenswrapper[4883]: E0310 09:04:23.765032 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 09:04:24 crc kubenswrapper[4883]: I0310 09:04:24.044317 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:24 crc kubenswrapper[4883]: E0310 09:04:24.135308 4883 eviction_manager.go:285] "Eviction manager: 
failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:04:25 crc kubenswrapper[4883]: I0310 09:04:25.043764 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:26 crc kubenswrapper[4883]: I0310 09:04:26.044161 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.043916 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.700130 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.700333 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.701332 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.701362 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.701396 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:28 crc kubenswrapper[4883]: I0310 09:04:28.043998 4883 csi_plugin.go:884] Failed to contact API server when waiting 
for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:29 crc kubenswrapper[4883]: I0310 09:04:29.046692 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.043927 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:30 crc kubenswrapper[4883]: E0310 09:04:30.749995 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.766084 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.767043 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.767076 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.767088 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.767111 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:04:30 crc kubenswrapper[4883]: E0310 09:04:30.770727 4883 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.043297 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.079119 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.079988 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.080016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.080024 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.080397 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13" Mar 10 09:04:31 crc kubenswrapper[4883]: E0310 09:04:31.080578 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:32 crc kubenswrapper[4883]: I0310 09:04:32.044033 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:33 crc kubenswrapper[4883]: I0310 09:04:33.044229 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:34 crc kubenswrapper[4883]: I0310 09:04:34.043822 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:34 crc kubenswrapper[4883]: E0310 09:04:34.136340 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:04:35 crc kubenswrapper[4883]: I0310 09:04:35.044237 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:36 crc kubenswrapper[4883]: I0310 09:04:36.044458 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.043623 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.402326 4883 csr.go:261] certificate signing request csr-j5rsq is approved, waiting 
to be issued Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.408033 4883 csr.go:257] certificate signing request csr-j5rsq is issued Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.477880 4883 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.771282 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.773029 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.773077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.773088 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.773251 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.779982 4883 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.780244 4883 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.780266 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783144 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783182 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783193 4883 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783213 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783260 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:37Z","lastTransitionTime":"2026-03-10T09:04:37Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.793682 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha25
6:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":51
0526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abf
dc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799237 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799279 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799297 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799307 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:37Z","lastTransitionTime":"2026-03-10T09:04:37Z","reason":"KubeletNotReady","message":"[container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.806074 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"r
egistry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"siz
eBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\"
:\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811617 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811643 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811663 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811670 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:37Z","lastTransitionTime":"2026-03-10T09:04:37Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.818908 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a
-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823853 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823862 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823879 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823889 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:37Z","lastTransitionTime":"2026-03-10T09:04:37Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.832073 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a
-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.832185 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.832214 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.932688 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.963296 4883 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.033280 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.133773 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.234752 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.335196 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: I0310 09:04:38.409406 4883 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-13 02:37:17.612067211 +0000 UTC Mar 10 09:04:38 crc kubenswrapper[4883]: I0310 09:04:38.409462 4883 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Waiting 6665h32m39.20260855s for next certificate rotation Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.435821 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.536520 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.637078 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.737880 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.838636 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.939407 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.039979 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.140752 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.241279 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.341667 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.442790 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc 
kubenswrapper[4883]: E0310 09:04:39.542870 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.643591 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.744393 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.845227 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.946101 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.046625 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.147610 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.248039 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.349033 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.449954 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.551068 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.651938 4883 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.752452 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.853426 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.954262 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.054970 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.155387 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.256141 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.356994 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.457931 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.558247 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.659082 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.759633 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.860740 4883 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.960910 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.061383 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.162274 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.263405 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.363810 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.464527 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.565117 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.665526 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.766218 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.867138 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.968133 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 
09:04:43.068756 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.169166 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.269594 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.370665 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.471566 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.572503 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.673396 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.774186 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.874825 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.975881 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.076882 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.136613 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node 
\"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.177789 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.278626 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.379411 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.480213 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.580275 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.681423 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.781925 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.883024 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.983314 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.079164 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.080363 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.080399 4883 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.080409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.081143 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.083722 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.184625 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.285611 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.307350 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.308994 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a"} Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.309207 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.310144 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.310177 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 
09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.310188 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.385956 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.486847 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.587939 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.688724 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.789833 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.890508 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.990570 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.091433 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.192346 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.292988 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.313111 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.313611 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.315233 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" exitCode=255 Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.315284 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a"} Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.315345 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.315498 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.316490 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.316522 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.316531 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.317137 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:04:46 
crc kubenswrapper[4883]: E0310 09:04:46.317301 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.393526 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.493639 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.576001 4883 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.595969 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.596012 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.596023 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.596041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.596053 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:46Z","lastTransitionTime":"2026-03-10T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698183 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698210 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698223 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:46Z","lastTransitionTime":"2026-03-10T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800151 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800160 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800190 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:46Z","lastTransitionTime":"2026-03-10T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902255 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902286 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902319 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902338 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902348 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:46Z","lastTransitionTime":"2026-03-10T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.003939 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.003977 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.003986 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.003996 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.004004 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.071161 4883 apiserver.go:52] "Watching apiserver" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.077009 4883 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.078425 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7xb47","openshift-network-diagnostics/network-check-target-xd92c","openshift-multus/multus-p898z","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-ovn-kubernetes/ovnkube-node-pzdml","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9","openshift-image-registry/node-ca-vvbjw","openshift-machine-config-operator/machine-config-daemon-zxzn8","openshift-multus/multus-additional-cni-plugins-nrzgf","openshift-multus/network-metrics-daemon-gmq5n","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.078894 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.079040 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.079094 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.079229 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.079217 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.080054 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.080330 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.080609 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.080620 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.080747 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.081788 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.081915 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.081946 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.081985 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.082022 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.082053 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.082313 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.082688 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.082820 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.084718 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.084884 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085200 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085418 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085507 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085598 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085606 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086022 4883 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086316 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086374 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086560 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086583 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086727 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086972 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087090 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087185 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087365 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087511 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087527 4883 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087553 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087602 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087607 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087624 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087633 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087638 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087733 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087748 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087752 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087764 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.087751 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087813 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087847 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087848 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.088069 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.088132 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.088243 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.099598 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.108888 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.109016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.109076 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.109146 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.109216 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.116348 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.122978 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.132605 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.134780 4883 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.141296 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.144284 4883 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.149082 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.161579 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.171327 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.181442 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.192257 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.199831 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.210188 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211819 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211828 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211850 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.222540 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.230837 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.238551 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.239763 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.239867 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.239945 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240037 4883 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240118 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240199 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240264 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240327 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240391 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240119 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240454 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240489 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240272 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240299 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240138 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240660 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240713 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240712 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240765 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240788 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240947 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241023 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241084 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241100 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241221 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241345 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241407 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241467 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241577 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241178 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241224 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241539 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241600 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241703 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241708 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241757 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241764 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241796 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241896 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241928 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242009 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242081 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242159 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242300 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.242365 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242437 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242540 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242384 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242551 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242565 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242663 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242597 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242621 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242727 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242739 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242765 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242789 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242809 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242823 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242839 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242857 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242873 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242889 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242904 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242922 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242937 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242952 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242981 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242989 4883 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242998 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243012 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243018 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243032 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243049 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243066 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243080 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243105 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243121 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243138 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243154 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243186 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243192 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: 
"87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243202 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243202 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243245 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243266 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243285 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243303 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243320 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243341 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243344 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243360 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243379 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243396 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243412 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243435 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243452 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243468 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243505 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243522 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243541 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243562 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243577 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243595 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243597 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243612 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243617 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243628 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243646 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243663 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243680 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243696 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243713 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243729 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243746 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243763 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243767 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243779 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243796 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243811 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.243827 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243845 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243861 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243877 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243894 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243910 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243920 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243927 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243924 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244058 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244083 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244103 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244119 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244135 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244278 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244376 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244391 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244642 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244716 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244748 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244766 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244782 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: 
I0310 09:04:47.244797 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244814 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244828 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244843 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244860 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244878 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244898 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244914 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244943 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244967 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244982 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244999 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245012 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245026 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245041 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245056 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 
09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245070 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245085 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245100 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245116 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245130 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245143 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245157 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245170 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245924 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245948 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245974 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 09:04:47 crc 
kubenswrapper[4883]: I0310 09:04:47.245988 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246006 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246023 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246040 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246059 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246078 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246098 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246187 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246207 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246244 4883 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246277 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246293 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246327 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246341 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246359 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246376 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246391 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246412 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246427 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249391 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245236 4883 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245383 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245503 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245594 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245823 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245853 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246428 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.246447 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.746430164 +0000 UTC m=+74.001328052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250165 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250196 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250217 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250235 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250253 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250278 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250296 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250311 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250432 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250498 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250531 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250578 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250597 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250616 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 09:04:47 crc 
kubenswrapper[4883]: I0310 09:04:47.250631 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250652 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250669 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250685 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250702 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250719 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250737 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250758 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250774 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250791 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250809 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 09:04:47 crc 
kubenswrapper[4883]: I0310 09:04:47.250825 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250842 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250864 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250879 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250896 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250912 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250943 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250972 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250988 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251006 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251022 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251039 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251056 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251072 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251089 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251108 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.251182 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251208 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdjf\" (UniqueName: \"kubernetes.io/projected/6c845e62-37a1-473c-a4d0-a354594903bc-kube-api-access-9zdjf\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251231 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251249 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99873383-15b6-42ee-a65f-7917294d2e02-mcd-auth-proxy-config\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251269 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-cnibin\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: 
\"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251285 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-os-release\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251301 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251320 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251339 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251355 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251371 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251388 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251403 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99873383-15b6-42ee-a65f-7917294d2e02-proxy-tls\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251423 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251438 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251453 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251470 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-netns\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251500 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53ffac75-0989-4945-915d-4aacec270cdb-host\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251515 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251537 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251557 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251573 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsr4f\" (UniqueName: \"kubernetes.io/projected/53ffac75-0989-4945-915d-4aacec270cdb-kube-api-access-qsr4f\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251626 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-multus-certs\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 
crc kubenswrapper[4883]: I0310 09:04:47.251643 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-system-cni-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251658 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251674 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251696 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251712 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") 
pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251727 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251745 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-hostroot\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-multus-daemon-config\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251782 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251800 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251817 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251835 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-cnibin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251852 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-kubelet\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251869 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251887 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-os-release\") pod \"multus-p898z\" 
(UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251904 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-socket-dir-parent\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251922 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251937 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251953 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-hosts-file\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252181 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64vnf\" (UniqueName: \"kubernetes.io/projected/bd6597a3-f861-4126-933e-d6134c8bd4b5-kube-api-access-64vnf\") pod 
\"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252215 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252257 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252275 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-conf-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252292 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53ffac75-0989-4945-915d-4aacec270cdb-serviceca\") pod 
\"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252332 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252349 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252367 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-system-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252383 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252401 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252417 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqn66\" (UniqueName: \"kubernetes.io/projected/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-kube-api-access-fqn66\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252433 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99873383-15b6-42ee-a65f-7917294d2e02-rootfs\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252450 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252464 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252492 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252509 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-etc-kubernetes\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252526 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58nsm\" (UniqueName: \"kubernetes.io/projected/99873383-15b6-42ee-a65f-7917294d2e02-kube-api-access-58nsm\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252543 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5fd36c79-e84e-49aa-97b9-616563193cd2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252559 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-bin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " 
pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252573 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-multus\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252590 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252604 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-k8s-cni-cncf-io\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252622 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2lkr\" (UniqueName: \"kubernetes.io/projected/5fd36c79-e84e-49aa-97b9-616563193cd2-kube-api-access-v2lkr\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252638 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-cni-binary-copy\") pod \"multus-p898z\" 
(UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252655 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wszn\" (UniqueName: \"kubernetes.io/projected/8e883c29-520e-4b1f-b49c-3df10450d467-kube-api-access-2wszn\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252675 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252693 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252708 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252723 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252824 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252837 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252847 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252856 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252866 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252898 4883 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252908 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252918 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252929 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252938 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252948 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252967 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252976 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252985 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 
09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252995 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253007 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253017 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253027 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253037 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253048 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253059 4883 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 
09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253070 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253079 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253089 4883 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253102 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253111 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253120 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253130 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.253139 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253149 4883 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253157 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253166 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253176 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253184 4883 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253194 4883 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253203 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253212 4883 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253221 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253230 4883 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253240 4883 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253249 4883 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253259 4883 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253268 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253278 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254395 4883 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255695 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257793 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.258014 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250861 4883 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250874 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246547 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246843 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246942 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246942 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246820 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247353 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247432 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247438 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247522 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247534 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247697 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247808 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247839 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248117 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248145 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248357 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248413 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248501 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248724 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248715 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248572 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248868 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249059 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249070 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249718 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249847 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249887 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249906 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249917 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250066 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250900 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251108 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251320 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251502 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251535 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251648 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251683 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251720 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251928 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252159 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252270 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.259500 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252284 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252963 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253229 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253358 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253378 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253512 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253547 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253743 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253758 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253882 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253916 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253944 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254061 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254139 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254260 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254280 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254607 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254759 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254849 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254880 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254904 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.255020 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.259743 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.7597138 +0000 UTC m=+74.014611690 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255289 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255330 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255510 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255569 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255366 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.256530 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.259845 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.75982446 +0000 UTC m=+74.014722349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.259924 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246537 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257584 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257503 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257297 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.258336 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.258464 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.258757 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.259327 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.259359 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257681 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260088 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260217 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260234 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260326 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260342 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260802 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.262428 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.264806 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.264828 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.264839 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.264861 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.264874 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.764865249 +0000 UTC m=+74.019763138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.265172 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.265177 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.265222 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.265232 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.265260 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.765252312 +0000 UTC m=+74.020150191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.265363 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.265596 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.265885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.266017 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.266712 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.266891 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.267857 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.267916 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.267997 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268071 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268080 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268236 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269286 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268279 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268456 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268604 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268767 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268777 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268845 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268878 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275364 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269860 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268236 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269014 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269065 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269340 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.270574 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.270671 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.270867 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275753 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275886 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275930 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275952 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275983 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.276122 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.276182 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.276377 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.277546 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.277869 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278018 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278142 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278281 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278399 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278420 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278686 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.279035 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.280230 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.280259 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.280263 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.280324 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.283051 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.288492 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.290691 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.297334 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.298797 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315281 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315317 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315331 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315351 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315363 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.319323 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353626 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99873383-15b6-42ee-a65f-7917294d2e02-rootfs\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353665 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353687 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99873383-15b6-42ee-a65f-7917294d2e02-rootfs\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353749 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-etc-kubernetes\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353710 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-etc-kubernetes\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353838 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353752 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353885 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqn66\" (UniqueName: \"kubernetes.io/projected/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-kube-api-access-fqn66\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353918 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58nsm\" (UniqueName: \"kubernetes.io/projected/99873383-15b6-42ee-a65f-7917294d2e02-kube-api-access-58nsm\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353937 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353953 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5fd36c79-e84e-49aa-97b9-616563193cd2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354010 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354035 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-k8s-cni-cncf-io\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354058 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-bin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354081 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-multus\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354123 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354147 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354148 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-k8s-cni-cncf-io\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354174 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2lkr\" (UniqueName: \"kubernetes.io/projected/5fd36c79-e84e-49aa-97b9-616563193cd2-kube-api-access-v2lkr\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354199 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-cni-binary-copy\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354223 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wszn\" (UniqueName: \"kubernetes.io/projected/8e883c29-520e-4b1f-b49c-3df10450d467-kube-api-access-2wszn\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354246 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99873383-15b6-42ee-a65f-7917294d2e02-mcd-auth-proxy-config\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354265 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354271 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-cnibin\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354319 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-cnibin\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354323 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-os-release\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354368 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-os-release\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354376 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354408 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-bin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354412 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354452 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdjf\" (UniqueName: \"kubernetes.io/projected/6c845e62-37a1-473c-a4d0-a354594903bc-kube-api-access-9zdjf\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354539 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354561 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354605 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354613 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354625 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99873383-15b6-42ee-a65f-7917294d2e02-proxy-tls\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354766 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53ffac75-0989-4945-915d-4aacec270cdb-host\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354792 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354822 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53ffac75-0989-4945-915d-4aacec270cdb-host\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354842 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354891 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354900 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-netns\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354923 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsr4f\" (UniqueName: \"kubernetes.io/projected/53ffac75-0989-4945-915d-4aacec270cdb-kube-api-access-qsr4f\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354944 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354976 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-multus-certs\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354999 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-system-cni-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355020 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355041 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355064 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355078 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355086 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355104 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-netns\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355107 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355129 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355150 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-cnibin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355166 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-hostroot\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355181 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-multus-daemon-config\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355210 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355225 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-os-release\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355244 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-socket-dir-parent\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355252 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-multus-certs\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355260 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-kubelet\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355279 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-system-cni-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355283 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355306 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355312 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355330 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355346 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-hosts-file\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355355 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355370 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64vnf\" (UniqueName: \"kubernetes.io/projected/bd6597a3-f861-4126-933e-d6134c8bd4b5-kube-api-access-64vnf\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355384 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-cnibin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53ffac75-0989-4945-915d-4aacec270cdb-serviceca\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355430 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355447 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-system-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355455 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-hostroot\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355464 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-conf-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355593 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355604 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355614 4883 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355623 4883 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355632 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355641 4883 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355650 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355658 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355668 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355678 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355687 4883 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355696 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355708 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355717 4883 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355726 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355735 4883 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355745 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355754 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355763 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355765 4883 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355793 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-conf-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355331 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355772 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355806 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355817 4883 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node 
\"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355847 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-socket-dir-parent\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355866 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355874 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-kubelet\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355881 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355894 4883 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 
crc kubenswrapper[4883]: I0310 09:04:47.355898 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355920 4883 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355938 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355952 4883 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355974 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355975 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355986 4883 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355998 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356000 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356009 4883 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356025 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-hosts-file\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.354867 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356046 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-system-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: 
I0310 09:04:47.355430 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-os-release\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355908 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356075 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356195 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-multus\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.356220 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.856187862 +0000 UTC m=+74.111085741 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356232 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356288 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356629 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356788 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356807 4883 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356826 4883 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356840 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356852 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356869 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356882 4883 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356894 4883 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356906 4883 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356920 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356935 4883 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356950 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356974 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356989 4883 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357003 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357018 4883 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath 
\"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357028 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357041 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357054 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357038 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-multus-daemon-config\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357056 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357027 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357073 4883 reconciler_common.go:293] 
"Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357184 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357202 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357215 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357227 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357241 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357255 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357265 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357277 4883 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357291 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357312 4883 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357323 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357337 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357348 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357360 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357373 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357391 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357402 4883 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357412 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357423 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357433 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357441 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.357452 4883 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357461 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357488 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357499 4883 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357511 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357522 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357531 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357541 4883 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357551 4883 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357559 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357569 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357579 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357588 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357630 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357640 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 
crc kubenswrapper[4883]: I0310 09:04:47.357650 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357659 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357667 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357677 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357682 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357688 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357884 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/99873383-15b6-42ee-a65f-7917294d2e02-mcd-auth-proxy-config\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-cni-binary-copy\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358580 4883 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358605 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358618 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358631 4883 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358642 4883 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358653 4883 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358664 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358675 4883 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358685 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358696 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358706 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358719 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358730 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358745 4883 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358757 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358769 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358783 4883 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358779 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5fd36c79-e84e-49aa-97b9-616563193cd2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358793 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358806 4883 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358818 4883 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358831 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358841 4883 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358851 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358864 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358874 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358884 4883 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358894 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358896 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358907 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358919 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358930 4883 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358941 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358951 4883 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358970 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358982 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358992 4883 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359002 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359012 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359022 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359032 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 
crc kubenswrapper[4883]: I0310 09:04:47.359043 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359052 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359061 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359070 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359082 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359091 4883 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359102 4883 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359113 4883 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359122 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359132 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359143 4883 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359152 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359154 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53ffac75-0989-4945-915d-4aacec270cdb-serviceca\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359161 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359203 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359222 4883 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359232 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359243 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359254 4883 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359266 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.366578 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99873383-15b6-42ee-a65f-7917294d2e02-proxy-tls\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.367218 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.369012 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdjf\" (UniqueName: \"kubernetes.io/projected/6c845e62-37a1-473c-a4d0-a354594903bc-kube-api-access-9zdjf\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.370866 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqn66\" (UniqueName: \"kubernetes.io/projected/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-kube-api-access-fqn66\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.371571 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsr4f\" (UniqueName: \"kubernetes.io/projected/53ffac75-0989-4945-915d-4aacec270cdb-kube-api-access-qsr4f\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.371777 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.372042 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v2lkr\" (UniqueName: \"kubernetes.io/projected/5fd36c79-e84e-49aa-97b9-616563193cd2-kube-api-access-v2lkr\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.372140 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64vnf\" (UniqueName: \"kubernetes.io/projected/bd6597a3-f861-4126-933e-d6134c8bd4b5-kube-api-access-64vnf\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.372490 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.373075 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wszn\" (UniqueName: \"kubernetes.io/projected/8e883c29-520e-4b1f-b49c-3df10450d467-kube-api-access-2wszn\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.373816 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58nsm\" (UniqueName: \"kubernetes.io/projected/99873383-15b6-42ee-a65f-7917294d2e02-kube-api-access-58nsm\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.383259 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.383594 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.383798 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.383812 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.390081 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.392365 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.397359 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.398421 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.404605 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.407082 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.412116 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:47 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 10 09:04:47 crc kubenswrapper[4883]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 09:04:47 crc kubenswrapper[4883]: ho_enable="--enable-hybrid-overlay" Mar 10 09:04:47 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 09:04:47 crc kubenswrapper[4883]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 09:04:47 crc kubenswrapper[4883]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --webhook-host=127.0.0.1 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --webhook-port=9743 \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ho_enable} \ Mar 10 09:04:47 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:04:47 crc kubenswrapper[4883]: --disable-approver \ Mar 10 09:04:47 crc kubenswrapper[4883]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --wait-for-kubernetes-api=200s \ Mar 10 09:04:47 crc kubenswrapper[4883]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.412213 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-5fd8779fcd7ff3888091c6737bdc03e284aa3701dba472f074512d524156462a WatchSource:0}: Error finding container 
5fd8779fcd7ff3888091c6737bdc03e284aa3701dba472f074512d524156462a: Status 404 returned error can't find the container with id 5fd8779fcd7ff3888091c6737bdc03e284aa3701dba472f074512d524156462a Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.414831 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:47 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --disable-webhook \ Mar 10 09:04:47 crc kubenswrapper[4883]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.414933 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.416161 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.416564 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:47 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: source /etc/kubernetes/apiserver-url.env Mar 10 09:04:47 crc kubenswrapper[4883]: else Mar 10 09:04:47 crc kubenswrapper[4883]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 09:04:47 crc kubenswrapper[4883]: exit 1 Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 09:04:47 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.416762 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.416838 4883 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f092ec2562855ff5d74a3df6d9eeaca7a2693347921d1cfef0225bbc53f8f421 WatchSource:0}: Error finding container f092ec2562855ff5d74a3df6d9eeaca7a2693347921d1cfef0225bbc53f8f421: Status 404 returned error can't find the container with id f092ec2562855ff5d74a3df6d9eeaca7a2693347921d1cfef0225bbc53f8f421 Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.417981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.418010 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.418019 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.418037 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.418048 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.418198 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.421719 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.426541 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.427716 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.428783 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.431707 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.432469 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:47 crc kubenswrapper[4883]: set -euo pipefail Mar 10 09:04:47 crc kubenswrapper[4883]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 10 09:04:47 crc kubenswrapper[4883]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 10 09:04:47 crc kubenswrapper[4883]: # As the secret mount is optional we must wait for the files to be present. Mar 10 09:04:47 crc kubenswrapper[4883]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 10 09:04:47 crc kubenswrapper[4883]: TS=$(date +%s) Mar 10 09:04:47 crc kubenswrapper[4883]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 10 09:04:47 crc kubenswrapper[4883]: HAS_LOGGED_INFO=0 Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: log_missing_certs(){ Mar 10 09:04:47 crc kubenswrapper[4883]: CUR_TS=$(date +%s) Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 10 09:04:47 crc kubenswrapper[4883]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 10 09:04:47 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 10 09:04:47 crc kubenswrapper[4883]: HAS_LOGGED_INFO=1 Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: } Mar 10 09:04:47 crc kubenswrapper[4883]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 10 09:04:47 crc kubenswrapper[4883]: log_missing_certs Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 5 Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/kube-rbac-proxy \ Mar 10 09:04:47 crc kubenswrapper[4883]: --logtostderr \ Mar 10 09:04:47 crc kubenswrapper[4883]: --secure-listen-address=:9108 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --upstream=http://127.0.0.1:29108/ \ Mar 10 09:04:47 crc kubenswrapper[4883]: --tls-private-key-file=${TLS_PK} \ Mar 10 09:04:47 crc kubenswrapper[4883]: --tls-cert-file=${TLS_CERT} Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.432991 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.435338 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:47 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: 
ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "false" == "true" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: persistent_ips_enabled_flag= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: # This is needed so that converting clusters from GA to TP Mar 10 09:04:47 crc kubenswrapper[4883]: # will rollout control plane pods as well Mar 10 09:04:47 crc kubenswrapper[4883]: network_segmentation_enabled_flag= Mar 10 09:04:47 crc kubenswrapper[4883]: multi_network_enabled_flag= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: multi_network_enabled_flag="--enable-multi-network" Mar 10 09:04:47 crc kubenswrapper[4883]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/ovnkube \ Mar 10 09:04:47 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:04:47 crc kubenswrapper[4883]: --init-cluster-manager "${K8S_NODE}" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 10 09:04:47 crc kubenswrapper[4883]: 
--loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --metrics-bind-address "127.0.0.1:29108" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --metrics-enable-pprof \ Mar 10 09:04:47 crc kubenswrapper[4883]: --metrics-enable-config-duration \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ovn_v4_join_subnet_opt} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ovn_v6_join_subnet_opt} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${dns_name_resolver_enabled_flag} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${persistent_ips_enabled_flag} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${multi_network_enabled_flag} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${network_segmentation_enabled_flag} Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.436864 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" podUID="5fd36c79-e84e-49aa-97b9-616563193cd2" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.438021 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.440268 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc928c48_1df8_4c31_986e_eba2aa7a1c0b.slice/crio-f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b WatchSource:0}: Error finding container f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b: Status 404 returned error can't find the container with id f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.442175 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 09:04:47 crc kubenswrapper[4883]: apiVersion: v1 Mar 10 09:04:47 crc kubenswrapper[4883]: clusters: Mar 10 09:04:47 crc kubenswrapper[4883]: - cluster: Mar 10 09:04:47 crc kubenswrapper[4883]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 09:04:47 crc kubenswrapper[4883]: server: https://api-int.crc.testing:6443 Mar 10 09:04:47 crc kubenswrapper[4883]: name: default-cluster Mar 10 09:04:47 crc kubenswrapper[4883]: contexts: Mar 10 09:04:47 crc kubenswrapper[4883]: - context: Mar 10 09:04:47 crc kubenswrapper[4883]: cluster: default-cluster Mar 10 09:04:47 crc kubenswrapper[4883]: namespace: default Mar 10 09:04:47 crc kubenswrapper[4883]: user: default-auth Mar 10 09:04:47 crc kubenswrapper[4883]: name: default-context Mar 10 09:04:47 crc kubenswrapper[4883]: current-context: default-context Mar 10 09:04:47 crc kubenswrapper[4883]: kind: Config Mar 10 09:04:47 crc kubenswrapper[4883]: preferences: {} Mar 10 09:04:47 crc kubenswrapper[4883]: users: 
Mar 10 09:04:47 crc kubenswrapper[4883]: - name: default-auth Mar 10 09:04:47 crc kubenswrapper[4883]: user: Mar 10 09:04:47 crc kubenswrapper[4883]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:47 crc kubenswrapper[4883]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:47 crc kubenswrapper[4883]: EOF Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h98t5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.443381 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.444396 
4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.444818 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.447285 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c845e62_37a1_473c_a4d0_a354594903bc.slice/crio-64d12c99c8f963e10751c69bc204dad6e6e3c7e21a25f2f3dda2ce8f8b5d812f WatchSource:0}: Error finding container 64d12c99c8f963e10751c69bc204dad6e6e3c7e21a25f2f3dda2ce8f8b5d812f: Status 404 returned error can't find the container with id 64d12c99c8f963e10751c69bc204dad6e6e3c7e21a25f2f3dda2ce8f8b5d812f Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.448139 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.448646 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ffac75_0989_4945_915d_4aacec270cdb.slice/crio-78f55b0a6b2a69803ecc700c2616b78125740f0ea83c2e5c3fc997f67b2cc092 WatchSource:0}: Error finding container 78f55b0a6b2a69803ecc700c2616b78125740f0ea83c2e5c3fc997f67b2cc092: Status 404 returned error can't find the container with id 78f55b0a6b2a69803ecc700c2616b78125740f0ea83c2e5c3fc997f67b2cc092 Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.451621 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nrzgf_openshift-multus(6c845e62-37a1-473c-a4d0-a354594903bc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.451671 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 
crc kubenswrapper[4883]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 10 09:04:47 crc kubenswrapper[4883]: while [ true ]; Mar 10 09:04:47 crc kubenswrapper[4883]: do Mar 10 09:04:47 crc kubenswrapper[4883]: for f in $(ls /tmp/serviceca); do Mar 10 09:04:47 crc kubenswrapper[4883]: echo $f Mar 10 09:04:47 crc kubenswrapper[4883]: ca_file_path="/tmp/serviceca/${f}" Mar 10 09:04:47 crc kubenswrapper[4883]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 10 09:04:47 crc kubenswrapper[4883]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 10 09:04:47 crc kubenswrapper[4883]: if [ -e "${reg_dir_path}" ]; then Mar 10 09:04:47 crc kubenswrapper[4883]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 10 09:04:47 crc kubenswrapper[4883]: else Mar 10 09:04:47 crc kubenswrapper[4883]: mkdir $reg_dir_path Mar 10 09:04:47 crc kubenswrapper[4883]: cp $ca_file_path $reg_dir_path/ca.crt Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: for d in $(ls /etc/docker/certs.d); do Mar 10 09:04:47 crc kubenswrapper[4883]: echo $d Mar 10 09:04:47 crc kubenswrapper[4883]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 10 09:04:47 crc kubenswrapper[4883]: reg_conf_path="/tmp/serviceca/${dp}" Mar 10 09:04:47 crc kubenswrapper[4883]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 10 09:04:47 crc kubenswrapper[4883]: rm -rf /etc/docker/certs.d/$d Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 60 & wait ${!} Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsr4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-vvbjw_openshift-image-registry(53ffac75-0989-4945-915d-4aacec270cdb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.452905 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" podUID="6c845e62-37a1-473c-a4d0-a354594903bc" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.453412 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.453623 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-vvbjw" podUID="53ffac75-0989-4945-915d-4aacec270cdb" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.459780 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41077f5_9f66_4be5_bb1a_e0f5b2b078e0.slice/crio-96f58adf2a741bc4cec321b2a3f65b1f75163f5b5e28eb2df89c426b3e6c8fe7 WatchSource:0}: Error finding container 96f58adf2a741bc4cec321b2a3f65b1f75163f5b5e28eb2df89c426b3e6c8fe7: Status 404 returned error can't find the container with id 96f58adf2a741bc4cec321b2a3f65b1f75163f5b5e28eb2df89c426b3e6c8fe7 Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.459766 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.460695 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e883c29_520e_4b1f_b49c_3df10450d467.slice/crio-61743c07d965f6f35c6e99cfd906e13f0cbea719515486ae16105f8f6e775f1a WatchSource:0}: Error finding container 61743c07d965f6f35c6e99cfd906e13f0cbea719515486ae16105f8f6e775f1a: Status 404 returned error can't find the container with id 61743c07d965f6f35c6e99cfd906e13f0cbea719515486ae16105f8f6e775f1a Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.461911 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc 
kubenswrapper[4883]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:47 crc kubenswrapper[4883]: set -uo pipefail Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 10 09:04:47 crc kubenswrapper[4883]: HOSTS_FILE="/etc/hosts" Mar 10 09:04:47 crc kubenswrapper[4883]: TEMP_FILE="/etc/hosts.tmp" Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: # Make a temporary file with the old hosts file's attributes. Mar 10 09:04:47 crc kubenswrapper[4883]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 10 09:04:47 crc kubenswrapper[4883]: echo "Failed to preserve hosts file. Exiting." Mar 10 09:04:47 crc kubenswrapper[4883]: exit 1 Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: while true; do Mar 10 09:04:47 crc kubenswrapper[4883]: declare -A svc_ips Mar 10 09:04:47 crc kubenswrapper[4883]: for svc in "${services[@]}"; do Mar 10 09:04:47 crc kubenswrapper[4883]: # Fetch service IP from cluster dns if present. We make several tries Mar 10 09:04:47 crc kubenswrapper[4883]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 10 09:04:47 crc kubenswrapper[4883]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 10 09:04:47 crc kubenswrapper[4883]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 10 09:04:47 crc kubenswrapper[4883]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:04:47 crc kubenswrapper[4883]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:04:47 crc kubenswrapper[4883]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:04:47 crc kubenswrapper[4883]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 10 09:04:47 crc kubenswrapper[4883]: for i in ${!cmds[*]} Mar 10 09:04:47 crc kubenswrapper[4883]: do Mar 10 09:04:47 crc kubenswrapper[4883]: ips=($(eval "${cmds[i]}")) Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: svc_ips["${svc}"]="${ips[@]}" Mar 10 09:04:47 crc kubenswrapper[4883]: break Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: # Update /etc/hosts only if we get valid service IPs Mar 10 09:04:47 crc kubenswrapper[4883]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 10 09:04:47 crc kubenswrapper[4883]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 10 09:04:47 crc kubenswrapper[4883]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 10 09:04:47 crc kubenswrapper[4883]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:04:47 crc kubenswrapper[4883]: continue Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: # Append resolver entries for services Mar 10 09:04:47 crc kubenswrapper[4883]: rc=0 Mar 10 09:04:47 crc kubenswrapper[4883]: for svc in "${!svc_ips[@]}"; do Mar 10 09:04:47 crc kubenswrapper[4883]: for ip in ${svc_ips[${svc}]}; do Mar 10 09:04:47 crc kubenswrapper[4883]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ $rc -ne 0 ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:04:47 crc kubenswrapper[4883]: continue Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 10 09:04:47 crc kubenswrapper[4883]: # Replace /etc/hosts with our modified version if needed Mar 10 09:04:47 crc kubenswrapper[4883]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 10 09:04:47 crc kubenswrapper[4883]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:04:47 crc kubenswrapper[4883]: unset svc_ips Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqn66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7xb47_openshift-dns(d41077f5-9f66-4be5-bb1a-e0f5b2b078e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.463226 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7xb47" 
podUID="d41077f5-9f66-4be5-bb1a-e0f5b2b078e0" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.466335 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 09:04:47 crc kubenswrapper[4883]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 10 09:04:47 crc kubenswrapper[4883]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wszn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.466604 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.467932 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99873383_15b6_42ee_a65f_7917294d2e02.slice/crio-33a804d063d73483c9020989e32442f359166c9020493396af4fcc6f7f1b75a3 WatchSource:0}: Error finding container 33a804d063d73483c9020989e32442f359166c9020493396af4fcc6f7f1b75a3: Status 404 returned error can't find the container with id 33a804d063d73483c9020989e32442f359166c9020493396af4fcc6f7f1b75a3 Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.467928 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.469977 4883 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.471869 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.473114 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.473241 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.479942 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.487240 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.493864 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520581 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520623 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520634 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520651 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520660 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623829 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623840 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623860 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623873 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.726967 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.727020 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.727034 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.727058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.727071 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.763000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.763104 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763169 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.763142835 +0000 UTC m=+75.018040724 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.763219 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763232 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763277 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.763268733 +0000 UTC m=+75.018166622 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763359 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763390 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.763383039 +0000 UTC m=+75.018280928 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829390 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829431 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829447 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829466 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829490 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.864222 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.864270 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.864294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864423 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: 
E0310 09:04:47.864444 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864463 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864510 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.864467043 +0000 UTC m=+75.119364932 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864536 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864575 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.864566492 +0000 UTC m=+75.119464381 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864593 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864631 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864650 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864727 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.864705024 +0000 UTC m=+75.119602923 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933816 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933830 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933859 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933873 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036267 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036340 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036351 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036370 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036384 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.083387 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.083946 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.085289 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.085893 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.086852 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.087361 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.087937 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.089062 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.089706 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.090587 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.091074 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.092089 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.092605 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.093121 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.094041 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.094611 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.095533 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.095958 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.096529 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.097564 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.098034 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.098986 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.099462 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.100338 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.100819 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.101412 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.102018 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.102458 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.103036 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.103512 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.103937 4883 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.104039 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.105210 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.105710 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.106137 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.107126 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.107721 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.108234 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.111197 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.111785 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.112532 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.113061 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.114579 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.115246 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.115784 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.116343 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.116891 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.117817 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.118290 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.118970 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.119416 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.120129 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.120672 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.121186 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.138401 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.138451 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc 
kubenswrapper[4883]: I0310 09:04:48.138463 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.138503 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.138514 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172238 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172277 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172319 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.182084 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.184985 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.185030 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.185041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.185055 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.185065 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.193343 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196076 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196096 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196106 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.204717 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210112 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210150 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210160 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210176 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210187 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.217349 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220588 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220648 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220667 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220680 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.228272 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.228408 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240722 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240793 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240806 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.324556 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" event={"ID":"5fd36c79-e84e-49aa-97b9-616563193cd2","Type":"ContainerStarted","Data":"71e009dc87367e50e041c8e8374a3628343780d0090eaf365ffc8a14120a7616"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.325707 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"61743c07d965f6f35c6e99cfd906e13f0cbea719515486ae16105f8f6e775f1a"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.327520 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:48 crc kubenswrapper[4883]: set -euo pipefail Mar 10 09:04:48 crc kubenswrapper[4883]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 10 09:04:48 crc kubenswrapper[4883]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 10 09:04:48 crc kubenswrapper[4883]: # As the secret mount is optional we must wait for the files to be present. Mar 10 09:04:48 crc kubenswrapper[4883]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 10 09:04:48 crc kubenswrapper[4883]: TS=$(date +%s) Mar 10 09:04:48 crc kubenswrapper[4883]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 10 09:04:48 crc kubenswrapper[4883]: HAS_LOGGED_INFO=0 Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: log_missing_certs(){ Mar 10 09:04:48 crc kubenswrapper[4883]: CUR_TS=$(date +%s) Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 10 09:04:48 crc kubenswrapper[4883]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 10 09:04:48 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 10 09:04:48 crc kubenswrapper[4883]: HAS_LOGGED_INFO=1 Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: } Mar 10 09:04:48 crc kubenswrapper[4883]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 10 09:04:48 crc kubenswrapper[4883]: log_missing_certs Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 5 Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/kube-rbac-proxy \ Mar 10 09:04:48 crc kubenswrapper[4883]: --logtostderr \ Mar 10 09:04:48 crc kubenswrapper[4883]: --secure-listen-address=:9108 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --upstream=http://127.0.0.1:29108/ \ Mar 10 09:04:48 crc kubenswrapper[4883]: --tls-private-key-file=${TLS_PK} \ Mar 10 09:04:48 crc kubenswrapper[4883]: --tls-cert-file=${TLS_CERT} Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.328368 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 09:04:48 crc kubenswrapper[4883]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 10 09:04:48 crc kubenswrapper[4883]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wszn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.328617 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5fd8779fcd7ff3888091c6737bdc03e284aa3701dba472f074512d524156462a"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.329504 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.329748 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:48 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:48 crc 
kubenswrapper[4883]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "false" == "true" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: persistent_ips_enabled_flag= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # This is needed so that converting clusters from GA to TP Mar 10 09:04:48 crc kubenswrapper[4883]: # will rollout control plane pods as well Mar 10 09:04:48 crc kubenswrapper[4883]: network_segmentation_enabled_flag= Mar 10 
09:04:48 crc kubenswrapper[4883]: multi_network_enabled_flag= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: multi_network_enabled_flag="--enable-multi-network" Mar 10 09:04:48 crc kubenswrapper[4883]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/ovnkube \ Mar 10 09:04:48 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:04:48 crc kubenswrapper[4883]: --init-cluster-manager "${K8S_NODE}" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 10 09:04:48 crc kubenswrapper[4883]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --metrics-bind-address "127.0.0.1:29108" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --metrics-enable-pprof \ Mar 10 09:04:48 crc kubenswrapper[4883]: --metrics-enable-config-duration \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ovn_v4_join_subnet_opt} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ovn_v6_join_subnet_opt} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${dns_name_resolver_enabled_flag} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${persistent_ips_enabled_flag} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${multi_network_enabled_flag} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${network_segmentation_enabled_flag} Mar 10 09:04:48 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.330203 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4b1c424b6315ca81459f7e78a4734f9c1c18842d33c51f3b8914a2bc431288d4"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.330868 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" podUID="5fd36c79-e84e-49aa-97b9-616563193cd2" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.331381 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:48 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: source /etc/kubernetes/apiserver-url.env Mar 10 09:04:48 crc kubenswrapper[4883]: else Mar 10 09:04:48 crc kubenswrapper[4883]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 09:04:48 crc kubenswrapper[4883]: exit 1 Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 09:04:48 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.331786 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"33a804d063d73483c9020989e32442f359166c9020493396af4fcc6f7f1b75a3"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.332124 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:48 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 09:04:48 crc kubenswrapper[4883]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 09:04:48 crc kubenswrapper[4883]: ho_enable="--enable-hybrid-overlay" Mar 10 09:04:48 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 09:04:48 crc kubenswrapper[4883]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 09:04:48 crc kubenswrapper[4883]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --webhook-host=127.0.0.1 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --webhook-port=9743 \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ho_enable} \ Mar 10 09:04:48 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:04:48 crc kubenswrapper[4883]: --disable-approver \ Mar 10 09:04:48 crc kubenswrapper[4883]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --wait-for-kubernetes-api=200s \ Mar 10 09:04:48 crc kubenswrapper[4883]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.332495 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.332732 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.332839 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvbjw" event={"ID":"53ffac75-0989-4945-915d-4aacec270cdb","Type":"ContainerStarted","Data":"78f55b0a6b2a69803ecc700c2616b78125740f0ea83c2e5c3fc997f67b2cc092"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.333936 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f092ec2562855ff5d74a3df6d9eeaca7a2693347921d1cfef0225bbc53f8f421"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.333975 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:48 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --disable-webhook \ Mar 10 09:04:48 crc kubenswrapper[4883]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.334190 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 10 09:04:48 crc kubenswrapper[4883]: while [ true ]; Mar 10 09:04:48 crc kubenswrapper[4883]: do Mar 10 09:04:48 crc 
kubenswrapper[4883]: for f in $(ls /tmp/serviceca); do Mar 10 09:04:48 crc kubenswrapper[4883]: echo $f Mar 10 09:04:48 crc kubenswrapper[4883]: ca_file_path="/tmp/serviceca/${f}" Mar 10 09:04:48 crc kubenswrapper[4883]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 10 09:04:48 crc kubenswrapper[4883]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 10 09:04:48 crc kubenswrapper[4883]: if [ -e "${reg_dir_path}" ]; then Mar 10 09:04:48 crc kubenswrapper[4883]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 10 09:04:48 crc kubenswrapper[4883]: else Mar 10 09:04:48 crc kubenswrapper[4883]: mkdir $reg_dir_path Mar 10 09:04:48 crc kubenswrapper[4883]: cp $ca_file_path $reg_dir_path/ca.crt Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: for d in $(ls /etc/docker/certs.d); do Mar 10 09:04:48 crc kubenswrapper[4883]: echo $d Mar 10 09:04:48 crc kubenswrapper[4883]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 10 09:04:48 crc kubenswrapper[4883]: reg_conf_path="/tmp/serviceca/${dp}" Mar 10 09:04:48 crc kubenswrapper[4883]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 10 09:04:48 crc kubenswrapper[4883]: rm -rf /etc/docker/certs.d/$d Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 60 & wait ${!} Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsr4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-vvbjw_openshift-image-registry(53ffac75-0989-4945-915d-4aacec270cdb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.334989 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerStarted","Data":"64d12c99c8f963e10751c69bc204dad6e6e3c7e21a25f2f3dda2ce8f8b5d812f"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.334990 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.335183 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.335239 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.335342 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-vvbjw" podUID="53ffac75-0989-4945-915d-4aacec270cdb" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.335677 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.335880 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nrzgf_openshift-multus(6c845e62-37a1-473c-a4d0-a354594903bc): CreateContainerConfigError: 
services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.336649 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.336615 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.336896 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7xb47" event={"ID":"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0","Type":"ContainerStarted","Data":"96f58adf2a741bc4cec321b2a3f65b1f75163f5b5e28eb2df89c426b3e6c8fe7"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.336971 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" podUID="6c845e62-37a1-473c-a4d0-a354594903bc" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.338425 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" 
event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.340101 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:48 crc kubenswrapper[4883]: set -uo pipefail Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 10 09:04:48 crc kubenswrapper[4883]: HOSTS_FILE="/etc/hosts" Mar 10 09:04:48 crc kubenswrapper[4883]: TEMP_FILE="/etc/hosts.tmp" Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # Make a temporary file with the old hosts file's attributes. Mar 10 09:04:48 crc kubenswrapper[4883]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 10 09:04:48 crc kubenswrapper[4883]: echo "Failed to preserve hosts file. Exiting." Mar 10 09:04:48 crc kubenswrapper[4883]: exit 1 Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: while true; do Mar 10 09:04:48 crc kubenswrapper[4883]: declare -A svc_ips Mar 10 09:04:48 crc kubenswrapper[4883]: for svc in "${services[@]}"; do Mar 10 09:04:48 crc kubenswrapper[4883]: # Fetch service IP from cluster dns if present. We make several tries Mar 10 09:04:48 crc kubenswrapper[4883]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 10 09:04:48 crc kubenswrapper[4883]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 10 09:04:48 crc kubenswrapper[4883]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 10 09:04:48 crc kubenswrapper[4883]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:04:48 crc kubenswrapper[4883]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:04:48 crc kubenswrapper[4883]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:04:48 crc kubenswrapper[4883]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 10 09:04:48 crc kubenswrapper[4883]: for i in ${!cmds[*]} Mar 10 09:04:48 crc kubenswrapper[4883]: do Mar 10 09:04:48 crc kubenswrapper[4883]: ips=($(eval "${cmds[i]}")) Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: svc_ips["${svc}"]="${ips[@]}" Mar 10 09:04:48 crc kubenswrapper[4883]: break Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # Update /etc/hosts only if we get valid service IPs Mar 10 09:04:48 crc kubenswrapper[4883]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 10 09:04:48 crc kubenswrapper[4883]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 10 09:04:48 crc kubenswrapper[4883]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 10 09:04:48 crc kubenswrapper[4883]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:04:48 crc kubenswrapper[4883]: continue Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # Append resolver entries for services Mar 10 09:04:48 crc kubenswrapper[4883]: rc=0 Mar 10 09:04:48 crc kubenswrapper[4883]: for svc in "${!svc_ips[@]}"; do Mar 10 09:04:48 crc kubenswrapper[4883]: for ip in ${svc_ips[${svc}]}; do Mar 10 09:04:48 crc kubenswrapper[4883]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ $rc -ne 0 ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:04:48 crc kubenswrapper[4883]: continue Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 10 09:04:48 crc kubenswrapper[4883]: # Replace /etc/hosts with our modified version if needed Mar 10 09:04:48 crc kubenswrapper[4883]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 10 09:04:48 crc kubenswrapper[4883]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:04:48 crc kubenswrapper[4883]: unset svc_ips Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqn66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7xb47_openshift-dns(d41077f5-9f66-4be5-bb1a-e0f5b2b078e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.340330 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.340754 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.342120 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7xb47" podUID="d41077f5-9f66-4be5-bb1a-e0f5b2b078e0" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.342497 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 09:04:48 crc kubenswrapper[4883]: apiVersion: v1 Mar 10 09:04:48 crc kubenswrapper[4883]: clusters: Mar 10 09:04:48 crc kubenswrapper[4883]: - cluster: Mar 10 09:04:48 crc kubenswrapper[4883]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 09:04:48 crc kubenswrapper[4883]: server: https://api-int.crc.testing:6443 Mar 10 09:04:48 crc kubenswrapper[4883]: name: default-cluster Mar 10 09:04:48 crc kubenswrapper[4883]: contexts: Mar 10 09:04:48 crc kubenswrapper[4883]: - context: Mar 10 09:04:48 crc kubenswrapper[4883]: cluster: default-cluster Mar 10 09:04:48 crc kubenswrapper[4883]: namespace: default Mar 10 09:04:48 crc kubenswrapper[4883]: user: default-auth Mar 10 09:04:48 crc kubenswrapper[4883]: name: default-context Mar 10 09:04:48 crc kubenswrapper[4883]: current-context: default-context Mar 10 09:04:48 crc kubenswrapper[4883]: kind: Config Mar 10 09:04:48 crc 
kubenswrapper[4883]: preferences: {} Mar 10 09:04:48 crc kubenswrapper[4883]: users: Mar 10 09:04:48 crc kubenswrapper[4883]: - name: default-auth Mar 10 09:04:48 crc kubenswrapper[4883]: user: Mar 10 09:04:48 crc kubenswrapper[4883]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:48 crc kubenswrapper[4883]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:48 crc kubenswrapper[4883]: EOF Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h98t5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.343783 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" 
podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.344982 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.345039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.345065 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.345079 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.345096 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.347675 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.355851 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.367734 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.377506 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.383677 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.391245 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.401754 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.414565 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.422048 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.430509 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.438842 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.445753 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447798 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447875 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447889 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.452514 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.460211 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.467338 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.473147 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.481915 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.488038 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.496225 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.505229 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.511504 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.513729 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.519795 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.526721 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.534180 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.541142 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550467 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550523 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550534 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550580 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.553716 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.561273 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.569490 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.577309 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653095 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653155 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653167 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653205 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755716 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755726 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755747 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755757 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.773546 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.773710 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 09:04:50.773690134 +0000 UTC m=+77.028588023 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.773783 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.773813 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.773950 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.774007 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:04:50.774000781 +0000 UTC m=+77.028898670 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.773952 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.774136 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:50.774118174 +0000 UTC m=+77.029016052 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858416 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858453 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858468 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858504 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858520 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.874955 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.875032 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.875062 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875123 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875215 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:50.875194584 +0000 UTC m=+77.130092473 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875224 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875248 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875262 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875269 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875293 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875309 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875331 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:50.875315012 +0000 UTC m=+77.130212911 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875378 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:50.875357562 +0000 UTC m=+77.130255441 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962553 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962593 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962603 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962627 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064858 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064892 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064901 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064919 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064929 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.079946 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.080092 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.080222 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.080279 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.080432 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.080501 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.080756 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.080799 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166800 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166811 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166825 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166835 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269823 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269874 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269884 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269895 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269903 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.341586 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.342062 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.371999 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.372035 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.372047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.372065 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.372080 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474666 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474767 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577506 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577561 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577577 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577592 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679370 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679407 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679418 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679431 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679442 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781239 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781249 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781262 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781272 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883575 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883610 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883621 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883636 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883646 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.922502 4883 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985801 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985809 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985823 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985833 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087696 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087733 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087744 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087759 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087770 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.189918 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.189948 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.189957 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.189968 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.190008 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292032 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292086 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292117 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394515 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394547 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394559 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394572 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394581 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496111 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496140 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496150 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496162 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496172 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598441 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598508 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598519 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598536 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598548 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700447 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700498 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700508 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700519 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700528 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.792677 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.792752 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.792816 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793566 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793613 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.792914734 +0000 UTC m=+81.047812613 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793641 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793663 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.79365207 +0000 UTC m=+81.048549959 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793720 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.793696102 +0000 UTC m=+81.048594002 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803197 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803207 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803226 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803237 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.894046 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.894106 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.894130 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894254 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894303 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894316 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:50 crc 
kubenswrapper[4883]: E0310 09:04:50.894397 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.894376654 +0000 UTC m=+81.149274553 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894263 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894463 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894321 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894544 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.894530536 +0000 UTC m=+81.149428414 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894493 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894580 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.894575069 +0000 UTC m=+81.149472958 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905497 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905531 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905542 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905552 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905560 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007044 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007100 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007112 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007127 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007139 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.079405 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.079424 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.079446 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.079531 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:51 crc kubenswrapper[4883]: E0310 09:04:51.079615 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:51 crc kubenswrapper[4883]: E0310 09:04:51.079733 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:51 crc kubenswrapper[4883]: E0310 09:04:51.079836 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:51 crc kubenswrapper[4883]: E0310 09:04:51.079891 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108825 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108861 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108872 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108887 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108895 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211610 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211643 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211663 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211672 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313774 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313804 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313828 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313836 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415117 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415152 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415179 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415194 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415205 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516874 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516897 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516907 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516919 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516928 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618491 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618520 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618530 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618542 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618549 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721173 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721210 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721219 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721230 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721240 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823043 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823084 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823100 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823131 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925014 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925070 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925083 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925105 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925127 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027253 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027308 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027325 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027344 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027358 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129201 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129231 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129241 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129256 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129264 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230857 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230898 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230912 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230928 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230939 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333806 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333833 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333846 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435754 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435808 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435820 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435843 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435859 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538413 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538568 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538634 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538760 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640638 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640689 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640700 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640720 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640908 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743900 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743951 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743961 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743978 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743993 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845760 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845771 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948101 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948140 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948150 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948165 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948178 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050508 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050548 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050561 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050579 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050592 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.078875 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.079008 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:53 crc kubenswrapper[4883]: E0310 09:04:53.079105 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.079165 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.079210 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:53 crc kubenswrapper[4883]: E0310 09:04:53.079220 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:53 crc kubenswrapper[4883]: E0310 09:04:53.079296 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:53 crc kubenswrapper[4883]: E0310 09:04:53.079446 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153372 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153399 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153420 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153428 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.254981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.255027 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.255040 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.255052 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.255062 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356847 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356856 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356871 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356882 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458778 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458810 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458821 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458835 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458845 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560671 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560741 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663124 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663158 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663167 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663178 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663186 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765196 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765235 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765269 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867466 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867525 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867539 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867560 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969353 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969389 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969398 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969411 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969420 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072104 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072147 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072159 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072173 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072188 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.089901 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.097303 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.110966 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.121088 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.127313 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.135299 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.144601 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.153046 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.160702 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.172830 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174281 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174311 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174321 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174335 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174346 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.181635 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.188696 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.194095 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.199907 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.206277 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276262 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276293 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276314 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276323 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378657 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378712 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378728 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378761 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481051 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481097 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481109 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481117 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583577 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583622 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583637 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583654 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583677 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686572 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686583 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686604 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788639 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788671 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788679 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788697 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.828284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.828344 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.828367 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828508 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.828458436 +0000 UTC m=+89.083356325 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828518 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828545 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828566 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.828557603 +0000 UTC m=+89.083455493 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828587 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:05:02.828574105 +0000 UTC m=+89.083471994 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890231 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890257 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890267 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890276 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890284 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.929281 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.929333 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.929363 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929467 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929558 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929581 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:54 crc 
kubenswrapper[4883]: E0310 09:04:54.929593 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929570 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.929546729 +0000 UTC m=+89.184444618 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929674 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.929656086 +0000 UTC m=+89.184553974 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929695 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929717 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929731 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929775 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.929764381 +0000 UTC m=+89.184662270 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993320 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993380 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993393 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993420 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.079920 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.080005 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:55 crc kubenswrapper[4883]: E0310 09:04:55.080096 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.079931 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.080148 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:55 crc kubenswrapper[4883]: E0310 09:04:55.080315 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:55 crc kubenswrapper[4883]: E0310 09:04:55.080431 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:55 crc kubenswrapper[4883]: E0310 09:04:55.080639 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.091832 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095863 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095895 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095918 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095928 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198210 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198247 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198257 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198270 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198284 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301840 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301851 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301875 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301889 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404241 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404277 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404288 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404307 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404318 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506872 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506883 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506919 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609696 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609738 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609750 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609782 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711138 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711192 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711204 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711233 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813721 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813737 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813747 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915262 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915272 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915292 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915307 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018161 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018216 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018228 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018258 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120259 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120295 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120311 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120321 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222762 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222773 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222791 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222801 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324678 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324687 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324697 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324705 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426718 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426744 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426754 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426783 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426791 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534667 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534752 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534769 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636447 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636510 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636520 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636539 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636551 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738712 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738770 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738797 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738806 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840742 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840778 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942415 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942498 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942513 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942545 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044199 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044232 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044256 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.078921 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.078936 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.078948 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:57 crc kubenswrapper[4883]: E0310 09:04:57.079020 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.079047 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:57 crc kubenswrapper[4883]: E0310 09:04:57.079135 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:57 crc kubenswrapper[4883]: E0310 09:04:57.080273 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:57 crc kubenswrapper[4883]: E0310 09:04:57.080344 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145726 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145757 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248127 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248182 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248200 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248213 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350135 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350179 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350189 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350204 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350216 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.451879 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.452163 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.452185 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.452201 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.452213 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554717 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554747 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554779 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656282 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656301 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656312 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758681 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758711 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758752 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758761 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860826 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.962942 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.962997 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.963145 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.963167 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.963180 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065868 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065878 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065900 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065911 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168693 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168716 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168727 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252178 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252254 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252265 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252284 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252298 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.260855 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263789 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263800 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263824 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.269696 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272626 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272636 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272656 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272666 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.278928 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281317 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281353 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281363 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281377 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281386 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.287962 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290378 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290422 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290432 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290445 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290455 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.298572 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.298683 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300556 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300579 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300588 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300599 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300608 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403061 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403097 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403127 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505471 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505537 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505567 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505580 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607203 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607213 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607227 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607236 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709510 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709545 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709556 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709566 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709573 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.811967 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.812011 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.812020 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.812047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.812059 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914013 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914063 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914075 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914098 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914111 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016152 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016201 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016217 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016228 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.079172 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.079289 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.079607 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.079728 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.079755 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.079628 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.080014 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.080234 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.081776 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:59 crc kubenswrapper[4883]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 09:04:59 crc kubenswrapper[4883]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 10 09:04:59 crc kubenswrapper[4883]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wszn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:59 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.081867 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.082762 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:59 crc kubenswrapper[4883]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 09:04:59 crc kubenswrapper[4883]: apiVersion: v1 Mar 10 09:04:59 crc kubenswrapper[4883]: clusters: Mar 10 09:04:59 crc kubenswrapper[4883]: - cluster: Mar 10 09:04:59 crc kubenswrapper[4883]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 09:04:59 crc kubenswrapper[4883]: server: https://api-int.crc.testing:6443 Mar 10 09:04:59 crc kubenswrapper[4883]: name: default-cluster Mar 10 09:04:59 crc kubenswrapper[4883]: contexts: Mar 10 09:04:59 crc kubenswrapper[4883]: - context: Mar 10 09:04:59 crc kubenswrapper[4883]: cluster: default-cluster Mar 10 09:04:59 crc kubenswrapper[4883]: namespace: default Mar 10 09:04:59 crc kubenswrapper[4883]: user: default-auth Mar 10 09:04:59 crc kubenswrapper[4883]: name: default-context Mar 10 09:04:59 crc kubenswrapper[4883]: current-context: default-context Mar 10 09:04:59 crc kubenswrapper[4883]: kind: Config Mar 10 09:04:59 crc kubenswrapper[4883]: preferences: {} Mar 10 09:04:59 crc kubenswrapper[4883]: users: Mar 10 09:04:59 crc kubenswrapper[4883]: - name: default-auth Mar 10 09:04:59 crc kubenswrapper[4883]: user: Mar 10 09:04:59 crc kubenswrapper[4883]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:59 crc kubenswrapper[4883]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:59 crc kubenswrapper[4883]: EOF Mar 10 09:04:59 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h98t5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:59 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.083540 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.084322 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.085811 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.085901 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118530 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118571 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118583 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118621 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221414 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221456 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221466 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221503 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221514 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323418 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323465 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323493 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323514 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323529 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425697 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425754 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425768 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528284 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528296 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528315 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528326 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630793 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630805 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733464 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733524 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733549 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733560 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835879 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835923 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835935 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835953 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835965 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937858 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937870 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937881 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937889 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039548 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039590 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039616 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039626 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: E0310 09:05:00.080688 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:00 crc kubenswrapper[4883]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 10 09:05:00 crc kubenswrapper[4883]: while [ true ]; Mar 10 09:05:00 crc kubenswrapper[4883]: do Mar 10 09:05:00 crc kubenswrapper[4883]: for f in $(ls /tmp/serviceca); do Mar 10 09:05:00 crc kubenswrapper[4883]: echo $f Mar 10 09:05:00 crc kubenswrapper[4883]: ca_file_path="/tmp/serviceca/${f}" Mar 10 09:05:00 crc kubenswrapper[4883]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 10 09:05:00 crc kubenswrapper[4883]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 10 09:05:00 crc kubenswrapper[4883]: if [ -e "${reg_dir_path}" ]; then Mar 10 09:05:00 crc kubenswrapper[4883]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 10 09:05:00 crc kubenswrapper[4883]: else Mar 10 09:05:00 crc kubenswrapper[4883]: mkdir $reg_dir_path Mar 10 09:05:00 crc kubenswrapper[4883]: cp $ca_file_path $reg_dir_path/ca.crt Mar 10 09:05:00 crc kubenswrapper[4883]: fi Mar 10 09:05:00 crc kubenswrapper[4883]: done Mar 10 09:05:00 crc kubenswrapper[4883]: for d in $(ls /etc/docker/certs.d); do Mar 10 09:05:00 crc kubenswrapper[4883]: echo $d Mar 10 09:05:00 crc kubenswrapper[4883]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 10 09:05:00 crc kubenswrapper[4883]: reg_conf_path="/tmp/serviceca/${dp}" Mar 10 09:05:00 crc kubenswrapper[4883]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 10 09:05:00 crc kubenswrapper[4883]: rm -rf /etc/docker/certs.d/$d Mar 10 09:05:00 crc kubenswrapper[4883]: fi Mar 10 09:05:00 crc kubenswrapper[4883]: done Mar 10 09:05:00 crc kubenswrapper[4883]: sleep 60 & wait ${!} Mar 10 09:05:00 crc kubenswrapper[4883]: done Mar 10 09:05:00 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsr4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-vvbjw_openshift-image-registry(53ffac75-0989-4945-915d-4aacec270cdb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:00 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:00 crc kubenswrapper[4883]: E0310 09:05:00.081873 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-vvbjw" podUID="53ffac75-0989-4945-915d-4aacec270cdb" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141752 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141764 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141779 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141793 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243424 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243501 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243522 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243544 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243558 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345103 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345157 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345175 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345207 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447586 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447614 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447628 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447638 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549182 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549208 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549220 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651684 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651771 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753732 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753742 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753753 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855845 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855880 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855890 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855900 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855908 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958082 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958117 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958126 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958138 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958147 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060500 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060534 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060543 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060555 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060565 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.079847 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.079871 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.080113 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.080187 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.080224 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.080332 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.080400 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.081895 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:01 crc kubenswrapper[4883]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 09:05:01 crc kubenswrapper[4883]: set -o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 09:05:01 crc kubenswrapper[4883]: source /etc/kubernetes/apiserver-url.env Mar 10 09:05:01 crc kubenswrapper[4883]: else Mar 10 09:05:01 crc kubenswrapper[4883]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 09:05:01 crc kubenswrapper[4883]: exit 1 Mar 10 09:05:01 crc kubenswrapper[4883]: fi Mar 10 09:05:01 crc kubenswrapper[4883]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 09:05:01 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:01 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.081920 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.083008 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:01 crc kubenswrapper[4883]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:05:01 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:05:01 crc kubenswrapper[4883]: set -o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:05:01 crc kubenswrapper[4883]: set +o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: fi Mar 10 09:05:01 crc kubenswrapper[4883]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 09:05:01 crc kubenswrapper[4883]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 09:05:01 crc kubenswrapper[4883]: ho_enable="--enable-hybrid-overlay" Mar 10 09:05:01 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 09:05:01 crc kubenswrapper[4883]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 09:05:01 crc kubenswrapper[4883]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 09:05:01 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:05:01 crc kubenswrapper[4883]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 09:05:01 crc kubenswrapper[4883]: --webhook-host=127.0.0.1 \ Mar 10 09:05:01 crc kubenswrapper[4883]: --webhook-port=9743 \ Mar 10 09:05:01 crc kubenswrapper[4883]: ${ho_enable} \ Mar 10 09:05:01 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:05:01 crc 
kubenswrapper[4883]: --disable-approver \ Mar 10 09:05:01 crc kubenswrapper[4883]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 09:05:01 crc kubenswrapper[4883]: --wait-for-kubernetes-api=200s \ Mar 10 09:05:01 crc kubenswrapper[4883]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 09:05:01 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:05:01 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions
:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:01 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.083046 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.083304 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nrzgf_openshift-multus(6c845e62-37a1-473c-a4d0-a354594903bc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.083374 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.084444 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.084504 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" podUID="6c845e62-37a1-473c-a4d0-a354594903bc" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.085653 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:01 crc kubenswrapper[4883]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:05:01 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:05:01 crc kubenswrapper[4883]: set -o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:05:01 crc kubenswrapper[4883]: set +o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: fi Mar 10 09:05:01 crc kubenswrapper[4883]: Mar 10 09:05:01 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 09:05:01 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:05:01 crc kubenswrapper[4883]: --disable-webhook \ Mar 10 09:05:01 crc kubenswrapper[4883]: 
--csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 09:05:01 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:05:01 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:01 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.086760 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162417 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162502 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162514 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162534 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162547 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264000 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264037 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264070 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264082 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366526 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366560 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366569 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366588 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366599 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.468931 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.468973 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.468983 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.468998 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.469008 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.570889 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.571132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.571143 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.571159 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.571172 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674381 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674454 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674468 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674517 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674532 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.776882 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.777036 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.777116 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.777183 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.777244 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880336 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880376 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880388 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880401 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880408 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982647 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982688 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982702 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982714 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982724 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.082078 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:02 crc kubenswrapper[4883]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 10 09:05:02 crc kubenswrapper[4883]: set -euo pipefail Mar 10 09:05:02 crc kubenswrapper[4883]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 10 09:05:02 crc kubenswrapper[4883]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 10 09:05:02 crc kubenswrapper[4883]: # As the secret mount is optional we must wait for the files to be present. 
Mar 10 09:05:02 crc kubenswrapper[4883]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 10 09:05:02 crc kubenswrapper[4883]: TS=$(date +%s) Mar 10 09:05:02 crc kubenswrapper[4883]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 10 09:05:02 crc kubenswrapper[4883]: HAS_LOGGED_INFO=0 Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: log_missing_certs(){ Mar 10 09:05:02 crc kubenswrapper[4883]: CUR_TS=$(date +%s) Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 10 09:05:02 crc kubenswrapper[4883]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 10 09:05:02 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 10 09:05:02 crc kubenswrapper[4883]: HAS_LOGGED_INFO=1 Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: } Mar 10 09:05:02 crc kubenswrapper[4883]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 10 09:05:02 crc kubenswrapper[4883]: log_missing_certs Mar 10 09:05:02 crc kubenswrapper[4883]: sleep 5 Mar 10 09:05:02 crc kubenswrapper[4883]: done Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 10 09:05:02 crc kubenswrapper[4883]: exec /usr/bin/kube-rbac-proxy \ Mar 10 09:05:02 crc kubenswrapper[4883]: --logtostderr \ Mar 10 09:05:02 crc kubenswrapper[4883]: --secure-listen-address=:9108 \ Mar 10 09:05:02 crc kubenswrapper[4883]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 10 09:05:02 crc kubenswrapper[4883]: --upstream=http://127.0.0.1:29108/ \ Mar 10 09:05:02 crc kubenswrapper[4883]: --tls-private-key-file=${TLS_PK} \ Mar 10 09:05:02 crc kubenswrapper[4883]: --tls-cert-file=${TLS_CERT} Mar 10 09:05:02 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:02 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.084333 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:02 crc kubenswrapper[4883]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: set -o allexport Mar 10 09:05:02 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:05:02 crc kubenswrapper[4883]: set +o allexport Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 10 
09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "false" == "true" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: persistent_ips_enabled_flag= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: # This is needed so that converting clusters from GA to TP Mar 10 09:05:02 crc kubenswrapper[4883]: # will rollout control plane pods as well Mar 10 09:05:02 crc kubenswrapper[4883]: network_segmentation_enabled_flag= Mar 10 09:05:02 crc kubenswrapper[4883]: multi_network_enabled_flag= Mar 10 09:05:02 crc 
kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: multi_network_enabled_flag="--enable-multi-network" Mar 10 09:05:02 crc kubenswrapper[4883]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 10 09:05:02 crc kubenswrapper[4883]: exec /usr/bin/ovnkube \ Mar 10 09:05:02 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:05:02 crc kubenswrapper[4883]: --init-cluster-manager "${K8S_NODE}" \ Mar 10 09:05:02 crc kubenswrapper[4883]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 10 09:05:02 crc kubenswrapper[4883]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 10 09:05:02 crc kubenswrapper[4883]: --metrics-bind-address "127.0.0.1:29108" \ Mar 10 09:05:02 crc kubenswrapper[4883]: --metrics-enable-pprof \ Mar 10 09:05:02 crc kubenswrapper[4883]: --metrics-enable-config-duration \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${ovn_v4_join_subnet_opt} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${ovn_v6_join_subnet_opt} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${dns_name_resolver_enabled_flag} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${persistent_ips_enabled_flag} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${multi_network_enabled_flag} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${network_segmentation_enabled_flag} Mar 10 09:05:02 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:02 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084747 4883 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084791 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084809 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084829 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084839 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.085769 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" podUID="5fd36c79-e84e-49aa-97b9-616563193cd2" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.095712 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187238 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187250 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187263 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187276 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289788 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289828 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289855 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289867 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391900 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391940 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391949 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391963 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391988 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493772 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493786 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493804 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493818 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596607 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596673 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596739 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699235 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699284 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699295 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699310 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699321 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801224 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801283 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801301 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801309 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.902838 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.903009 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed.
No retries permitted until 2026-03-10 09:05:18.902985897 +0000 UTC m=+105.157883785 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.904049 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.904126 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.904306 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.904368 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:18.904351932 +0000 UTC m=+105.159249821 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.904434 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.904531 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:18.904460488 +0000 UTC m=+105.159358376 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908786 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908844 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908861 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908873 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.004645 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.004685 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.004730 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004776 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004802 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004815 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004820 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004833 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004850 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004864 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004868 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:19.004853123 +0000 UTC m=+105.259751013 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004888 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:19.004881457 +0000 UTC m=+105.259779346 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004902 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:19.004896645 +0000 UTC m=+105.259794524 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012005 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012042 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012053 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012103 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012112 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.079918 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.079934 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.079937 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.080046 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.080058 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.080365 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.080433 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.080387 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113772 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113809 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113823 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113837 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113849 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215835 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215859 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215869 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215884 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215892 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318154 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318192 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318204 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318217 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318228 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420545 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420593 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420623 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420635 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522710 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522777 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522799 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522810 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624779 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624839 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624850 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624864 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624884 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726532 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726561 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726570 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726599 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828730 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828740 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828756 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828767 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930886 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930947 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930960 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930975 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930986 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032611 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032620 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032641 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.080550 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a"
Mar 10 09:05:04 crc kubenswrapper[4883]: E0310 09:05:04.080738 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:05:04 crc kubenswrapper[4883]: E0310 09:05:04.082351 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 10 09:05:04 crc kubenswrapper[4883]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash
Mar 10 09:05:04 crc kubenswrapper[4883]: set -uo pipefail
Mar 10 09:05:04 crc kubenswrapper[4883]: 
Mar 10 09:05:04 crc kubenswrapper[4883]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM
Mar 10 09:05:04 crc kubenswrapper[4883]: 
Mar 10 09:05:04 crc kubenswrapper[4883]: OPENSHIFT_MARKER="openshift-generated-node-resolver"
Mar 10 09:05:04 crc kubenswrapper[4883]: HOSTS_FILE="/etc/hosts"
Mar 10 09:05:04 crc kubenswrapper[4883]: TEMP_FILE="/etc/hosts.tmp"
Mar 10 09:05:04 crc kubenswrapper[4883]: 
Mar 10 09:05:04 crc kubenswrapper[4883]: IFS=', ' read -r -a services <<< "${SERVICES}"
Mar 10 09:05:04 crc kubenswrapper[4883]: 
Mar 10 09:05:04 crc kubenswrapper[4883]: # Make a temporary file with the old hosts file's attributes.
Mar 10 09:05:04 crc kubenswrapper[4883]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then
Mar 10 09:05:04 crc kubenswrapper[4883]: echo "Failed to preserve hosts file. Exiting."
Mar 10 09:05:04 crc kubenswrapper[4883]: exit 1
Mar 10 09:05:04 crc kubenswrapper[4883]: fi
Mar 10 09:05:04 crc kubenswrapper[4883]: 
Mar 10 09:05:04 crc kubenswrapper[4883]: while true; do
Mar 10 09:05:04 crc kubenswrapper[4883]: declare -A svc_ips
Mar 10 09:05:04 crc kubenswrapper[4883]: for svc in "${services[@]}"; do
Mar 10 09:05:04 crc kubenswrapper[4883]: # Fetch service IP from cluster dns if present. We make several tries
Mar 10 09:05:04 crc kubenswrapper[4883]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones
Mar 10 09:05:04 crc kubenswrapper[4883]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not
Mar 10 09:05:04 crc kubenswrapper[4883]: # support UDP loadbalancers and require reaching DNS through TCP.
Mar 10 09:05:04 crc kubenswrapper[4883]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 10 09:05:04 crc kubenswrapper[4883]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 10 09:05:04 crc kubenswrapper[4883]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 10 09:05:04 crc kubenswrapper[4883]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"')
Mar 10 09:05:04 crc kubenswrapper[4883]: for i in ${!cmds[*]}
Mar 10 09:05:04 crc kubenswrapper[4883]: do
Mar 10 09:05:04 crc kubenswrapper[4883]: ips=($(eval "${cmds[i]}"))
Mar 10 09:05:04 crc kubenswrapper[4883]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then
Mar 10 09:05:04 crc kubenswrapper[4883]: svc_ips["${svc}"]="${ips[@]}"
Mar 10 09:05:04 crc kubenswrapper[4883]: break
Mar 10 09:05:04 crc kubenswrapper[4883]: fi
Mar 10 09:05:04 crc kubenswrapper[4883]: done
Mar 10 09:05:04 crc kubenswrapper[4883]: done
Mar 10 09:05:04 crc kubenswrapper[4883]: 
Mar 10 09:05:04 crc kubenswrapper[4883]: # Update /etc/hosts only if we get valid service IPs
Mar 10 09:05:04 crc kubenswrapper[4883]: # We will not update /etc/hosts when there is coredns service outage or api unavailability
Mar 10 09:05:04 crc kubenswrapper[4883]: # Stale entries could exist in /etc/hosts if the service is deleted
Mar 10 09:05:04 crc kubenswrapper[4883]: if [[ -n "${svc_ips[*]-}" ]]; then
Mar 10 09:05:04 crc kubenswrapper[4883]: # Build a new hosts file from /etc/hosts with our custom entries filtered out
Mar 10 09:05:04 crc kubenswrapper[4883]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then
Mar 10 09:05:04 crc kubenswrapper[4883]: # Only continue rebuilding the hosts entries if its original content is preserved
Mar 10 09:05:04 crc kubenswrapper[4883]: sleep 60 & wait
Mar 10 09:05:04 crc kubenswrapper[4883]: continue
Mar 10 09:05:04 crc kubenswrapper[4883]: fi
Mar 10 09:05:04 crc kubenswrapper[4883]: 
Mar 10 09:05:04 crc kubenswrapper[4883]: # Append resolver entries for services
Mar 10 09:05:04 crc kubenswrapper[4883]: rc=0
Mar 10 09:05:04 crc kubenswrapper[4883]: for svc in "${!svc_ips[@]}"; do
Mar 10 09:05:04 crc kubenswrapper[4883]: for ip in ${svc_ips[${svc}]}; do
Mar 10 09:05:04 crc kubenswrapper[4883]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$?
Mar 10 09:05:04 crc kubenswrapper[4883]: done
Mar 10 09:05:04 crc kubenswrapper[4883]: done
Mar 10 09:05:04 crc kubenswrapper[4883]: if [[ $rc -ne 0 ]]; then
Mar 10 09:05:04 crc kubenswrapper[4883]: sleep 60 & wait
Mar 10 09:05:04 crc kubenswrapper[4883]: continue
Mar 10 09:05:04 crc kubenswrapper[4883]: fi
Mar 10 09:05:04 crc kubenswrapper[4883]: 
Mar 10 09:05:04 crc kubenswrapper[4883]: 
Mar 10 09:05:04 crc kubenswrapper[4883]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior
Mar 10 09:05:04 crc kubenswrapper[4883]: # Replace /etc/hosts with our modified version if needed
Mar 10 09:05:04 crc kubenswrapper[4883]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}"
Mar 10 09:05:04 crc kubenswrapper[4883]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn
Mar 10 09:05:04 crc kubenswrapper[4883]: fi
Mar 10 09:05:04 crc kubenswrapper[4883]: sleep 60 & wait
Mar 10 09:05:04 crc kubenswrapper[4883]: unset svc_ips
Mar 10 09:05:04 crc kubenswrapper[4883]: done
Mar 10 09:05:04 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqn66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7xb47_openshift-dns(d41077f5-9f66-4be5-bb1a-e0f5b2b078e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:04 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:04 crc kubenswrapper[4883]: E0310 09:05:04.083464 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7xb47" podUID="d41077f5-9f66-4be5-bb1a-e0f5b2b078e0" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.090606 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.099035 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.107595 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.118943 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.126253 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.132092 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134308 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134327 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134339 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.138121 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.146608 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.154341 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.160299 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.172059 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.180849 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.186413 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.193189 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.199513 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.205706 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.213228 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.236956 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.236991 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.237003 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.237018 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.237030 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339224 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339249 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339278 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.440981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.441028 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.441039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.441058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.441070 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542787 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542825 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542837 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542848 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542860 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645119 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645141 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747358 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747404 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747415 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747430 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747440 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849872 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849884 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849916 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952210 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952241 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952250 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952264 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952273 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054175 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054191 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.079589 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.079658 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.079699 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.079828 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:05 crc kubenswrapper[4883]: E0310 09:05:05.079844 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:05 crc kubenswrapper[4883]: E0310 09:05:05.079947 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:05 crc kubenswrapper[4883]: E0310 09:05:05.080066 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:05 crc kubenswrapper[4883]: E0310 09:05:05.080201 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156572 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156582 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156606 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258615 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258645 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258668 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258678 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.360990 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.361159 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.361240 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.361302 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.361358 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.462980 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.463016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.463027 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.463041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.463048 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.532810 4883 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565081 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565122 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565135 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667038 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667153 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667229 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667295 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667349 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769073 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769368 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769378 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769388 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769397 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871322 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871824 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871897 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871972 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974197 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974233 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974247 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974256 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.075662 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.075907 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.075991 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.076075 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.076143 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179286 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179504 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179571 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179646 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179732 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.282943 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.283175 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.283272 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.283356 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.283453 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385665 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385699 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385710 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385733 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488054 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488130 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488149 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488164 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590613 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590623 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590639 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590647 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692774 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692903 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692958 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794805 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794851 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794864 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794874 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896768 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896783 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896793 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998432 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998501 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998517 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998526 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.078872 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.078918 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.078922 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:07 crc kubenswrapper[4883]: E0310 09:05:07.079019 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.079048 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:07 crc kubenswrapper[4883]: E0310 09:05:07.079169 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:07 crc kubenswrapper[4883]: E0310 09:05:07.079298 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:07 crc kubenswrapper[4883]: E0310 09:05:07.079382 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100321 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100349 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100360 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100374 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100383 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202391 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202645 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202705 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202765 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304791 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304844 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304871 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304889 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304900 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407840 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407850 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407862 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407871 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510021 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510091 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510119 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510141 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612881 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612896 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612918 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612932 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716009 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716042 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716051 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716065 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716075 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818433 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818564 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818586 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818598 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920891 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920930 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920944 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920955 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920967 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.022988 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.023034 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.023043 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.023066 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.023078 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125121 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125199 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125216 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125261 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227439 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227512 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227528 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227542 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227555 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329064 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329125 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329141 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329152 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430642 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430669 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430683 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529398 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529436 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529446 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529457 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529466 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.538653 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541491 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541525 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541536 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541549 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541560 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.549946 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553085 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553135 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553165 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553176 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.561039 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564657 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564707 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564720 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564751 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.571817 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574617 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574629 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574654 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.584915 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.585179 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586705 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586778 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586898 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689185 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689229 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689242 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689250 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791200 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791211 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791224 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791235 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892927 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892958 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892968 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892982 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892991 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995020 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995056 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995071 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995079 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.079300 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.079417 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:09 crc kubenswrapper[4883]: E0310 09:05:09.079573 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.079604 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:09 crc kubenswrapper[4883]: E0310 09:05:09.079776 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:09 crc kubenswrapper[4883]: E0310 09:05:09.079919 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.079973 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:09 crc kubenswrapper[4883]: E0310 09:05:09.080074 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096429 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096457 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096470 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096500 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096509 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198411 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198457 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198495 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198508 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300051 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300085 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300096 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300127 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401521 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401545 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401566 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401575 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503276 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503299 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503308 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503319 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503332 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604844 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604880 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604890 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604901 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604908 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.705987 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.706008 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.706016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.706025 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.706032 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807575 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807644 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807660 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807669 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909781 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909791 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909808 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909819 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012278 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012326 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012337 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012355 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012368 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114153 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114164 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114177 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114187 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216442 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216505 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216515 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216531 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216543 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318753 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318764 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318778 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318785 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421076 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421085 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421098 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421107 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523263 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523305 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523314 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523327 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523335 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625396 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625430 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625439 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625453 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625465 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727355 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727412 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727424 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727444 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727452 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829709 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829721 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829732 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829741 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932149 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932290 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932301 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033776 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033816 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033827 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033851 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.079274 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.079304 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.079316 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.079391 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:11 crc kubenswrapper[4883]: E0310 09:05:11.079415 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:11 crc kubenswrapper[4883]: E0310 09:05:11.079560 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:11 crc kubenswrapper[4883]: E0310 09:05:11.079836 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:11 crc kubenswrapper[4883]: E0310 09:05:11.079921 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137942 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137952 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137967 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137979 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240135 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240196 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240206 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342160 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342205 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342233 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.392510 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvbjw" event={"ID":"53ffac75-0989-4945-915d-4aacec270cdb","Type":"ContainerStarted","Data":"082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.401263 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.415930 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.425913 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.432065 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.441095 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.443961 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.443991 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.444001 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.444016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.444027 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.447915 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.454766 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.462560 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.470046 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.477903 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.485884 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
20f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.497250 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9
9fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.504295 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.511008 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.517013 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kube
let\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.522644 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.529701 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546463 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546510 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546522 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546541 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546556 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648446 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648493 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648503 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648515 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648526 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750219 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750270 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750283 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750317 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852513 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852565 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852577 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852585 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954371 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954408 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954417 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954433 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954445 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056080 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056095 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056108 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056117 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158916 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158951 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158964 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158979 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158992 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.261901 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.261945 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.261973 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.261994 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.262006 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364440 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364519 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364532 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364553 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364587 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.396447 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" exitCode=0 Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.396554 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.399005 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.399055 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.401255 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.401299 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4"} Mar 10 09:05:12 crc kubenswrapper[4883]: 
I0310 09:05:12.407216 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.416200 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.426228 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.433296 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.441897 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.450169 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.457246 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.463913 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467005 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467075 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467085 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.479909 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.491895 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.502284 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.513074 4883 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.524168 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.533592 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc 
kubenswrapper[4883]: I0310 09:05:12.542532 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.570109 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.570184 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.570196 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 
09:05:12.570216 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.570230 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.586120 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.616163 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.628335 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.639881 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.649400 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.658792 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.667558 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc 
kubenswrapper[4883]: I0310 09:05:12.673087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.673142 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.673154 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.673171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.673184 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.675458 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.686630 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.707298 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.715805 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.727433 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.736071 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.746908 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.756332 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.765763 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.774923 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775697 4883 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775730 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775765 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.790282 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.802316 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.878384 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.878719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.878731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc 
kubenswrapper[4883]: I0310 09:05:12.878747 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.878761 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981217 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981226 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981238 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981246 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.079891 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.079969 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.080168 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.080196 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:13 crc kubenswrapper[4883]: E0310 09:05:13.080309 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:13 crc kubenswrapper[4883]: E0310 09:05:13.080436 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:13 crc kubenswrapper[4883]: E0310 09:05:13.080534 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:13 crc kubenswrapper[4883]: E0310 09:05:13.080760 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084789 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084814 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084842 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084854 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.187679 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.187980 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.188003 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.188024 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.188035 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289685 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289699 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289716 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289726 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392150 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392212 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392228 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392260 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409695 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409738 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409749 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409759 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409769 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.494680 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.494737 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 
crc kubenswrapper[4883]: I0310 09:05:13.494750 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.494775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.494789 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597038 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597101 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597113 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.699851 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.699930 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.699947 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.699982 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.700003 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802160 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802209 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802239 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802251 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904540 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904570 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904582 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904603 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006669 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006722 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006753 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006767 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.094773 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.105411 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108548 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108582 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108592 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108607 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108617 4883 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.116567 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.130297 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.140596 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.149620 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.159501 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.169066 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.179385 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.189778 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.199084 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210455 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210515 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210531 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210569 4883 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.213509 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.229364 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.245422 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.255469 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.264818 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05
:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.276235 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313383 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313508 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313605 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313686 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313745 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418256 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418293 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418306 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.419884 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.425555 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.426921 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.430598 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.441465 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.450338 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.458543 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.466605 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.475064 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.488339 4883 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.498910 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.507328 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.517078 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.519942 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.519980 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.519990 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.520005 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.520016 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.527748 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.537167 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.547321 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc 
kubenswrapper[4883]: I0310 09:05:14.555393 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.567538 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.584178 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.595895 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.605243 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.617043 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623271 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623282 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623299 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623311 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.627177 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.635501 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.646905 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.657022 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.671244 4883 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.684242 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.693058 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.703816 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.714772 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.725923 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.725959 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.725971 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.725990 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.726003 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.728074 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.735965 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.745440 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.756131 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.777137 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.786577 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.828590 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.828633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.828643 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc 
kubenswrapper[4883]: I0310 09:05:14.828660 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.828670 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932715 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932727 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932789 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932813 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.036343 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.036385 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.036395 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.038083 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.038140 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.078900 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.078900 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.078966 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.079027 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:15 crc kubenswrapper[4883]: E0310 09:05:15.079184 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:15 crc kubenswrapper[4883]: E0310 09:05:15.079316 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:15 crc kubenswrapper[4883]: E0310 09:05:15.079532 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:15 crc kubenswrapper[4883]: E0310 09:05:15.079561 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141239 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141274 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141285 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141297 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141306 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.243949 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.244014 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.244030 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.244053 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.244068 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346884 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346925 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346935 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346951 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346961 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449444 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449521 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449533 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449551 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449564 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552403 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552715 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552728 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552753 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655431 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655489 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655500 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655516 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655526 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757698 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757763 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757795 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757808 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860089 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860100 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860122 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860133 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.961974 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.962128 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.962205 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.962275 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.962343 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064711 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064753 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064766 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064782 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064795 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.079780 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:05:16 crc kubenswrapper[4883]: E0310 09:05:16.079963 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167114 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167174 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167186 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269273 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269319 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269329 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269345 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269357 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371259 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371300 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371310 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371326 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371336 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.437931 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.439609 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7xb47" event={"ID":"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0","Type":"ContainerStarted","Data":"128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.442461 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.445055 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd" exitCode=0 Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.445095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.454792 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.468452 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473697 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473735 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473764 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473775 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.479610 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.489027 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.499518 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.514343 4883 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.526242 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.535615 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.546089 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.556108 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.568706 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577305 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577349 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577362 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc 
kubenswrapper[4883]: I0310 09:05:16.577382 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577397 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577806 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc 
kubenswrapper[4883]: I0310 09:05:16.586300 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.597031 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.612821 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.623953 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.634898 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.642213 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.651892 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.661639 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
9:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.670582 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679549 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679626 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679638 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.686328 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.697490 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.706280 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.715578 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.724662 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.735322 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.751555 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.761664 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.772570 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782718 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782762 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782793 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782815 4883 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.784672 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.800981 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.811280 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.820612 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.885959 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.886109 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.886198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.886270 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.886334 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988559 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988590 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988601 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988617 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988629 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.079266 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.079313 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:17 crc kubenswrapper[4883]: E0310 09:05:17.079380 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.079397 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:17 crc kubenswrapper[4883]: E0310 09:05:17.079536 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.079593 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:17 crc kubenswrapper[4883]: E0310 09:05:17.079682 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:17 crc kubenswrapper[4883]: E0310 09:05:17.079796 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091063 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091097 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091126 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091138 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193914 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193932 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193953 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193967 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296248 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296307 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296321 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296340 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296351 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399592 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399637 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399646 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399661 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399674 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.452685 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" event={"ID":"5fd36c79-e84e-49aa-97b9-616563193cd2","Type":"ContainerStarted","Data":"0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.453055 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" event={"ID":"5fd36c79-e84e-49aa-97b9-616563193cd2","Type":"ContainerStarted","Data":"28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.455224 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09" exitCode=0 Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.455354 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.467850 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.487258 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.498776 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.502297 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.502335 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.502348 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc 
kubenswrapper[4883]: I0310 09:05:17.502365 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.502377 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.509884 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc 
kubenswrapper[4883]: I0310 09:05:17.519126 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.529648 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.540382 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.549274 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.572628 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.588192 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.597640 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.604768 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.604971 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.604980 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.604996 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.605006 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.609451 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.619666 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.629652 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.639464 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.649425 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.662279 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.673645 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.683346 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.698780 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707591 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707636 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707651 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707965 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707998 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.712994 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.724191 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.735652 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.745365 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.754789 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.767988 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.778988 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.789503 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.802346 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
20f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810415 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810453 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810465 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810496 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810508 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.815951 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.825033 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.834406 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc 
kubenswrapper[4883]: I0310 09:05:17.841752 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
sr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.850582 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913411 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913454 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913509 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913521 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016070 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016105 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016144 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118642 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118661 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118677 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221272 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221314 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221324 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221340 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221354 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324142 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324154 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324174 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324195 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426620 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426658 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426685 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426697 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.463575 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.463895 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.467065 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba" exitCode=0 Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.467163 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.481033 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.490981 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.493750 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.503772 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.512857 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.521628 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528733 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528760 4883 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528794 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.534257 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.554121 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.563512 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.572969 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.583800 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.593853 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.603464 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.614111 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.622696 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630742 4883 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630781 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630793 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630811 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630823 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.638774 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.652236 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.660678 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.668420 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.678799 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.690117 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 
09:05:18.698224 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.707608 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.716688 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.725379 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732940 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732955 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732971 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732983 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.734728 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.749348 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.759414 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:1
2Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.769452 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.779021 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.788408 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.789221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.789268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.789279 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc 
kubenswrapper[4883]: I0310 09:05:18.789294 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.789305 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.797424 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc 
kubenswrapper[4883]: E0310 09:05:18.798907 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801631 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801664 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801677 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801687 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.805234 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.810376 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812809 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812829 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812840 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812852 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812861 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.816641 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.824915 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828836 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828869 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828883 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828895 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828903 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.837227 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.838647 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841340 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841373 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841385 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841397 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841406 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.850079 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.850209 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851526 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851567 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851579 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851608 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953915 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953949 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953962 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953975 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953985 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.969345 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969536 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 09:05:50.969517858 +0000 UTC m=+137.224415737 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.969581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.969634 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969765 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969829 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969872 4883 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:50.969865586 +0000 UTC m=+137.224763475 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969963 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:50.969944946 +0000 UTC m=+137.224842835 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.056804 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.057023 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.057099 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.057193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.057252 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.070709 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.070792 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.070840 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.070909 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.070937 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.070951 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.070988 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071024 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071047 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071001 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:51.070989536 +0000 UTC m=+137.325887424 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071064 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071111 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:51.071082491 +0000 UTC m=+137.325980380 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071236 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:51.071165469 +0000 UTC m=+137.326063357 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.079094 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.079109 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.079111 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.079131 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.079460 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.079591 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.079678 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.079813 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159741 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159795 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159812 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159823 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262552 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262603 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262613 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262634 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262648 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364585 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364617 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364626 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364647 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364658 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.467463 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.468051 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.468068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.468096 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.468116 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.471972 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146" exitCode=0 Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.472058 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.473558 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/0.log" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.477527 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232" exitCode=1 Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.477580 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.478306 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.478499 4883 scope.go:117] "RemoveContainer" containerID="13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.489366 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75
079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.507351 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.507613 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.523214 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.538530 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.551083 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.561075 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.570749 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.570787 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.570798 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc 
kubenswrapper[4883]: I0310 09:05:19.570820 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.570835 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.572370 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.583638 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.595008 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.606486 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.618584 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
20f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.634509 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.644842 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.653330 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc 
kubenswrapper[4883]: I0310 09:05:19.661569 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.670976 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.672951 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.672973 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.672985 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.673007 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.673020 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.682556 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.691926 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.702344 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.719266 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.730062 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.748947 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc 
kubenswrapper[4883]: I0310 09:05:19.759915 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
sr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.771456 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775518 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775574 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775588 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775609 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775625 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.789722 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.799555 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pro
xy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.822048 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401661 6670 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 
policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401995 6670 obj_retry.go:551] Creating *factory.egressNode crc took: 1.772885ms\\\\nI0310 09:05:19.402027 6670 factory.go:1336] Added *v1.Node event handler 7\\\\nI0310 09:05:19.402065 6670 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 09:05:19.402516 6670 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:05:19.402601 6670 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:05:19.402639 6670 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:05:19.402671 6670 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:05:19.402735 6670 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff25
6bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.834341 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.848898 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.871891 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877572 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877643 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.884653 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.897794 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16f
f495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.937176 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.956726 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979322 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979368 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979378 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979396 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979408 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083408 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083439 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083450 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083499 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186182 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186222 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186232 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186257 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288542 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288611 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288621 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390455 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390522 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390533 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390552 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390568 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.482119 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/1.log" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.482670 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/0.log" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.485229 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" exitCode=1 Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.485326 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.485401 4883 scope.go:117] "RemoveContainer" containerID="13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.486102 4883 scope.go:117] "RemoveContainer" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" Mar 10 09:05:20 crc kubenswrapper[4883]: E0310 09:05:20.486323 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.489278 4883 generic.go:334] "Generic (PLEG): container 
finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0" exitCode=0 Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.489318 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492176 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492214 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492241 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492252 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.499418 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.513741 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229dd
d3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.526358 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.535664 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42
ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.552520 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401661 6670 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401995 6670 obj_retry.go:551] Creating *factory.egressNode crc took: 1.772885ms\\\\nI0310 09:05:19.402027 6670 factory.go:1336] Added *v1.Node event handler 7\\\\nI0310 09:05:19.402065 6670 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 09:05:19.402516 6670 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:05:19.402601 6670 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:05:19.402639 6670 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:05:19.402671 6670 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:05:19.402735 6670 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.563805 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.571349 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.583389 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.592405 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.594562 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.594596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.594608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc 
kubenswrapper[4883]: I0310 09:05:20.594625 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.594637 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.602288 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 
09:05:20.612414 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.622690 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.632419 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.644013 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
20f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.662443 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.674776 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.683965 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc 
kubenswrapper[4883]: I0310 09:05:20.694438 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
sr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697525 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697564 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697576 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697601 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697616 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.704059 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.713058 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.731894 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401661 6670 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401995 6670 obj_retry.go:551] Creating *factory.egressNode crc took: 1.772885ms\\\\nI0310 09:05:19.402027 6670 factory.go:1336] Added *v1.Node event handler 7\\\\nI0310 09:05:19.402065 6670 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 09:05:19.402516 6670 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:05:19.402601 6670 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:05:19.402639 6670 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:05:19.402671 6670 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:05:19.402735 6670 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.770436 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799701 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799737 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799765 4883 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799775 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.806355 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.849005 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.887944 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.902502 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.902557 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.902570 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc 
kubenswrapper[4883]: I0310 09:05:20.902593 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.902606 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.929114 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.969004 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004336 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004370 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004381 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004397 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004408 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.009541 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.048129 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.079892 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.079995 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.079923 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.080213 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.080311 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.080460 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.080610 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.080655 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.088929 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107034 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107232 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107307 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107366 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.132409 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.168261 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209618 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc 
kubenswrapper[4883]: I0310 09:05:21.209915 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209939 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209952 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209970 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209985 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.247525 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313496 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313533 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313544 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313559 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313570 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415870 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415881 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415895 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415904 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.498220 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270" exitCode=0 Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.498324 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.500275 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/1.log" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.504297 4883 scope.go:117] "RemoveContainer" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.504465 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.508292 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519436 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519865 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519876 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519444 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519893 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc 
kubenswrapper[4883]: I0310 09:05:21.519906 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.530116 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.546867 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401661 6670 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401995 6670 obj_retry.go:551] Creating *factory.egressNode crc took: 1.772885ms\\\\nI0310 09:05:19.402027 6670 factory.go:1336] Added *v1.Node event handler 7\\\\nI0310 09:05:19.402065 6670 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 09:05:19.402516 6670 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:05:19.402601 6670 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:05:19.402639 6670 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:05:19.402671 6670 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:05:19.402735 6670 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.557984 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab
878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.565763 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.574319 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.582397 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.607072 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622684 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622718 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622771 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.648884 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.688330 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726379 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726504 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726565 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726623 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726879 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.727844 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.768042 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.812408 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829619 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829631 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829650 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829665 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.848434 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.886980 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.928144 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931802 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931863 4883 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931876 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931922 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931939 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.968555 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.007151 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034189 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034200 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034228 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034244 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.051677 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.091359 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.129183 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136760 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136775 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.168515 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.208457 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.240077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.240119 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.240129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc 
kubenswrapper[4883]: I0310 09:05:22.240148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.240161 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.247461 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.289176 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42
ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.329052 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342071 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342117 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342135 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342149 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.369848 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.408586 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.443981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.444016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.444027 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.444046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.444062 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.452144 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.488276 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.510638 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerStarted","Data":"b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.527941 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc 
kubenswrapper[4883]: I0310 09:05:22.545614 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.545658 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.545671 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.545691 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.545705 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.567685 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.610027 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.647262 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.647296 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.647308 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.647323 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc 
kubenswrapper[4883]: I0310 09:05:22.647334 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.651897 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.690915 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.726036 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749789 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749802 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749813 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.767569 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.807006 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.846739 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852467 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852591 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852654 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852740 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852801 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.887086 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.928249 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956224 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956288 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956328 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956348 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.968362 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.008914 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.053309 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.058996 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.059032 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.059041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.059059 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.059071 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.079118 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.079126 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.079148 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.079177 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:23 crc kubenswrapper[4883]: E0310 09:05:23.079337 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:23 crc kubenswrapper[4883]: E0310 09:05:23.079509 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:23 crc kubenswrapper[4883]: E0310 09:05:23.079601 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:23 crc kubenswrapper[4883]: E0310 09:05:23.079665 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.087752 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.127103 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc 
kubenswrapper[4883]: I0310 09:05:23.160932 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.160969 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.160982 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.160999 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.161009 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.167375 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.209366 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.248312 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.262608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc 
kubenswrapper[4883]: I0310 09:05:23.262639 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.262655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.262670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.262681 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.287885 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365017 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365069 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365089 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365102 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467650 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467691 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467700 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467713 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467721 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570008 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570074 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570089 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.671941 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.671977 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.671987 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.672000 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.672010 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774192 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774245 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774277 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774291 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876683 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876768 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978292 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978328 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978338 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978350 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978358 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080563 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080602 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080614 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080631 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080643 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.089763 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.096174 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72
d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.105772 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.114118 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc 
kubenswrapper[4883]: I0310 09:05:24.122633 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.132934 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.143492 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.151903 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.165621 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.177137 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182255 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182281 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182307 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182322 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.187796 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.200501 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.210320 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.219132 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.228092 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.238018 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.247520 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.259284 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284145 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284155 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284183 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.386790 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.387009 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.387074 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.387151 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.387213 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.488715 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.489139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.489214 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.489291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.489350 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591567 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591604 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591642 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694297 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694337 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694349 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694365 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694378 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.796944 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.796986 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.796998 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.797016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.797026 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899339 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899352 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899371 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899387 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001768 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001789 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.079834 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.079855 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.079869 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:25 crc kubenswrapper[4883]: E0310 09:05:25.079952 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:25 crc kubenswrapper[4883]: E0310 09:05:25.080033 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.080041 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:25 crc kubenswrapper[4883]: E0310 09:05:25.080100 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:25 crc kubenswrapper[4883]: E0310 09:05:25.080277 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.103955 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.103993 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.104004 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.104019 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.104028 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205758 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205770 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205790 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205802 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308257 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308290 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308306 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410861 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410902 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410919 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410935 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410946 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513061 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513104 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513113 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513138 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615767 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615798 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615808 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615822 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615830 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718207 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820283 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820327 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820336 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820356 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820368 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922616 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922662 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922674 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922692 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922702 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024427 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024457 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024465 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024494 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024504 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126377 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126404 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126413 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126423 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126432 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.227971 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.227992 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.227999 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.228009 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.228017 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329772 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329801 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329811 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329819 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329826 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431451 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431552 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431567 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431590 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431603 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533121 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533199 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533218 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533281 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635177 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635189 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635205 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635216 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736751 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736795 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736804 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838004 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838031 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838048 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838058 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939850 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939907 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939920 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939936 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939947 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041850 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041879 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041887 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041899 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041907 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.079722 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.079881 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.079885 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.079914 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:27 crc kubenswrapper[4883]: E0310 09:05:27.079996 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:27 crc kubenswrapper[4883]: E0310 09:05:27.080064 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.080089 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:05:27 crc kubenswrapper[4883]: E0310 09:05:27.080125 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:27 crc kubenswrapper[4883]: E0310 09:05:27.080212 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.143961 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.143993 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.144005 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.144021 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.144031 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.247960 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.248264 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.248275 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.248292 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.248304 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350714 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350762 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350780 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453009 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453070 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453084 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453123 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.527112 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.528709 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.529106 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.542306 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f3
6cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.554278 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555400 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555437 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555450 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555467 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555493 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.564734 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.574356 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.582585 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc 
kubenswrapper[4883]: I0310 09:05:27.591151 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.601577 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.616688 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.627517 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:
35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.635223 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.646287 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.657169 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b391942524
0e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a5550885
03f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10
T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658111 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658157 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658172 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658206 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.666119 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.674694 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.682130 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.692298 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.701886 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.717492 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.763994 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.764046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.764064 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.764082 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.764093 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866822 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866865 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866876 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866897 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866907 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968888 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968924 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968935 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968949 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968958 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071398 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071429 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071438 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071453 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071463 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173829 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173839 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173851 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173861 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275860 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275905 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275916 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275933 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275944 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378420 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378579 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378716 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378793 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481256 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481317 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481383 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481436 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.583538 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.583892 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.583967 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.584037 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.584103 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686102 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686113 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686125 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686137 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788385 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788419 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788430 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788440 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890770 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890846 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981872 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981898 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981909 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981921 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981929 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: E0310 09:05:28.992196 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995138 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995180 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995200 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.004902 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:29Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008079 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008091 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008099 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.017271 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:29Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020631 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020641 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020654 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020661 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.030324 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:29Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033277 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033303 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033313 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033324 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033335 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.042009 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:29Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.042109 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043208 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043234 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043245 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043265 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043275 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.078853 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.078895 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.078862 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.078853 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.078957 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.079008 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.079064 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.079195 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.144963 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.144992 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.145001 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.145015 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.145024 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252738 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252776 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252788 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252803 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252814 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355274 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355287 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355300 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355311 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457705 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457740 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457762 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457771 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559514 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559557 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559566 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559599 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661581 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661621 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661643 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661653 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763621 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763659 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763669 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763684 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763695 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865509 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865544 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865565 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865575 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967518 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967558 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967569 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967595 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069677 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069715 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069737 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069746 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172180 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172234 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172253 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172272 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274638 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274686 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274697 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274714 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274724 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.376997 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.377032 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.377043 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.377060 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.377070 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479197 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479232 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479244 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479254 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581693 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581759 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581812 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683313 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683327 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683335 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785773 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785796 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785804 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887803 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887828 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887837 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990230 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990264 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990283 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990298 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990308 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.079227 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.079330 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.079330 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:31 crc kubenswrapper[4883]: E0310 09:05:31.079432 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:31 crc kubenswrapper[4883]: E0310 09:05:31.079512 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:31 crc kubenswrapper[4883]: E0310 09:05:31.079578 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.079714 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:31 crc kubenswrapper[4883]: E0310 09:05:31.079921 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092089 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092098 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092112 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092123 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193779 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193792 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193800 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296226 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296253 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296263 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296294 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296305 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399021 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399079 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399094 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399104 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501649 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501677 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501688 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501699 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501708 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604120 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604164 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604172 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706143 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706180 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706206 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706216 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808678 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808711 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808721 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808743 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910536 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910561 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910570 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910588 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.012943 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.012979 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.012990 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.013002 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.013010 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115079 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115107 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115138 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217314 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217356 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217368 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217387 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217399 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319088 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319128 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319168 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422134 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422194 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422216 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422229 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524414 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524452 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524463 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524498 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524509 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626798 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626835 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626846 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626862 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626872 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729488 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729523 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729533 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729568 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729579 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833265 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833306 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833316 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833329 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833339 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935709 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935754 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038351 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038399 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038429 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038441 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.079688 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.079721 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.079719 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.079731 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:33 crc kubenswrapper[4883]: E0310 09:05:33.079846 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:33 crc kubenswrapper[4883]: E0310 09:05:33.080023 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:33 crc kubenswrapper[4883]: E0310 09:05:33.080305 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:33 crc kubenswrapper[4883]: E0310 09:05:33.080523 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.080593 4883 scope.go:117] "RemoveContainer" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141188 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141227 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141249 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141290 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243230 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243510 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243521 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243544 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345693 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345733 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345746 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345764 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345775 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.447932 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.447968 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.447978 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.447993 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.448002 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549640 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549671 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549687 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.550400 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/1.log" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.564131 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.564742 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.582884 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.593342 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.608550 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.619004 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.627135 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.636840 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.643984 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652056 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652092 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652102 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652125 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.653591 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.661712 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.669170 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pro
xy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.684714 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.695311 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.704372 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.713777 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.722559 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.731719 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.746868 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755857 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755869 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.756875 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.857981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.858038 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.858049 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.858071 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.858083 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960660 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960700 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960732 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:34 crc kubenswrapper[4883]: E0310 09:05:34.061459 4883 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.090705 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\"
:\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.101638 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.111698 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.120793 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42
ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.138296 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.149453 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: E0310 09:05:34.151182 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.158345 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.169051 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.177497 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac
37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.187327 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.196940 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.205708 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.214777 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.224317 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.238079 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.246737 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.254854 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.262864 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.569710 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/2.log" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 
09:05:34.570449 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/1.log" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.573626 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" exitCode=1 Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.573672 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76"} Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.573715 4883 scope.go:117] "RemoveContainer" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.574268 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:34 crc kubenswrapper[4883]: E0310 09:05:34.574458 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.591867 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.601877 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.609946 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc 
kubenswrapper[4883]: I0310 09:05:34.618536 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.630134 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.639368 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.648708 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.657062 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.674653 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.690382 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.698159 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.707600 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.716419 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.724752 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.733245 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.742331 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.752073 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.761631 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.079808 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.079852 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.079917 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.079998 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.080088 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.080192 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.080343 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.080201 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.579121 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/2.log" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.584572 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.584761 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.595406 4883 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.606489 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.616912 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.625148 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.635933 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.651962 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.660679 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.667896 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.676872 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.688923 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.698712 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.707658 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.716844 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.725928 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.739677 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.749599 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.757793 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.767001 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:37 crc kubenswrapper[4883]: I0310 09:05:37.079608 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:37 crc kubenswrapper[4883]: I0310 09:05:37.079670 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:37 crc kubenswrapper[4883]: I0310 09:05:37.079621 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:37 crc kubenswrapper[4883]: E0310 09:05:37.079762 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:37 crc kubenswrapper[4883]: I0310 09:05:37.079774 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:37 crc kubenswrapper[4883]: E0310 09:05:37.079867 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:37 crc kubenswrapper[4883]: E0310 09:05:37.080091 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:37 crc kubenswrapper[4883]: E0310 09:05:37.080138 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.517272 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.528976 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.539422 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.548639 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.557978 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.566859 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc 
kubenswrapper[4883]: I0310 09:05:38.574400 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.583745 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.598700 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.608015 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:
35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.616540 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.625549 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.635797 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b391942524
0e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a5550885
03f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10
T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.642960 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.651745 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.659921 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.668386 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.676996 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.694802 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.079378 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.079439 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.079448 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.079521 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.079649 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.079539 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.079821 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.080147 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.153061 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303133 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303169 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303210 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.313685 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317199 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317300 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317315 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317338 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317350 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.326232 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328857 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328887 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328898 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328907 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328917 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.337707 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339856 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339885 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339894 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339905 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339913 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.348352 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350882 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350915 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350924 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350932 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.359268 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.359395 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:41 crc kubenswrapper[4883]: I0310 09:05:41.079171 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:41 crc kubenswrapper[4883]: I0310 09:05:41.079268 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:41 crc kubenswrapper[4883]: E0310 09:05:41.079328 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:41 crc kubenswrapper[4883]: I0310 09:05:41.079372 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:41 crc kubenswrapper[4883]: E0310 09:05:41.079520 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:41 crc kubenswrapper[4883]: E0310 09:05:41.079700 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:41 crc kubenswrapper[4883]: I0310 09:05:41.079761 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:41 crc kubenswrapper[4883]: E0310 09:05:41.079886 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:43 crc kubenswrapper[4883]: I0310 09:05:43.079198 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:43 crc kubenswrapper[4883]: E0310 09:05:43.079644 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:43 crc kubenswrapper[4883]: I0310 09:05:43.079265 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:43 crc kubenswrapper[4883]: E0310 09:05:43.079723 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:43 crc kubenswrapper[4883]: I0310 09:05:43.079324 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:43 crc kubenswrapper[4883]: E0310 09:05:43.079790 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:43 crc kubenswrapper[4883]: I0310 09:05:43.079216 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:43 crc kubenswrapper[4883]: E0310 09:05:43.079837 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.090704 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},
{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473
a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.098842 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.107505 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.115269 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.122927 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.131105 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.146451 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: E0310 09:05:44.153568 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.156281 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.166117 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.173950 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.182406 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.189628 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc 
kubenswrapper[4883]: I0310 09:05:44.196463 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.205576 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.218309 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.226253 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:
35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.233603 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.242426 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:45 crc kubenswrapper[4883]: I0310 09:05:45.079050 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:45 crc kubenswrapper[4883]: I0310 09:05:45.079077 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:45 crc kubenswrapper[4883]: I0310 09:05:45.079068 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:45 crc kubenswrapper[4883]: I0310 09:05:45.079056 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:45 crc kubenswrapper[4883]: E0310 09:05:45.079181 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:45 crc kubenswrapper[4883]: E0310 09:05:45.079395 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:45 crc kubenswrapper[4883]: E0310 09:05:45.079429 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:45 crc kubenswrapper[4883]: E0310 09:05:45.079512 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.079103 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.079211 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.079285 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.079303 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.079373 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.079388 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.079526 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.080262 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.080430 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.079620 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.079872 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.079923 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.079892 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.079990 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.080148 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.080386 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.080712 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.080984 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.154449 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645489 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645523 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645532 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645546 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645559 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.656285 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659188 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659200 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659235 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659245 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.668048 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671123 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671149 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.679879 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682646 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682683 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682695 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682712 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682722 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.691362 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698126 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698192 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698206 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698217 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.710167 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.710290 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.049111 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.049238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049273 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.049242862 +0000 UTC m=+201.304140761 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.049357 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049409 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049556 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049563 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.049530219 +0000 UTC m=+201.304428108 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049698 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.049644026 +0000 UTC m=+201.304541915 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.079379 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.079465 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.079604 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.079618 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.079688 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.079833 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.079906 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.080028 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.088907 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.149805 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.149864 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.149903 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150053 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150069 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150081 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150114 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.150105202 +0000 UTC m=+201.405003092 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150053 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150168 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150198 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.150184624 +0000 UTC m=+201.405082523 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150208 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150231 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150292 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.150276108 +0000 UTC m=+201.405174007 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:53 crc kubenswrapper[4883]: I0310 09:05:53.079175 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:53 crc kubenswrapper[4883]: I0310 09:05:53.079211 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:53 crc kubenswrapper[4883]: I0310 09:05:53.079244 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:53 crc kubenswrapper[4883]: I0310 09:05:53.079279 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:53 crc kubenswrapper[4883]: E0310 09:05:53.079350 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:53 crc kubenswrapper[4883]: E0310 09:05:53.079512 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:53 crc kubenswrapper[4883]: E0310 09:05:53.079604 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:53 crc kubenswrapper[4883]: E0310 09:05:53.079651 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.094366 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.104528 4883 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.116457 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.126126 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.133978 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.143489 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: E0310 09:05:54.155278 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.158843 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.174842 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.184205 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.193779 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.201505 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.212068 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.221439 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.229594 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.238308 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.246795 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.259257 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.270439 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.278047 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:55 crc kubenswrapper[4883]: I0310 09:05:55.078914 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:55 crc kubenswrapper[4883]: I0310 09:05:55.078972 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:55 crc kubenswrapper[4883]: I0310 09:05:55.079027 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:55 crc kubenswrapper[4883]: I0310 09:05:55.079056 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:55 crc kubenswrapper[4883]: E0310 09:05:55.079212 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:55 crc kubenswrapper[4883]: E0310 09:05:55.079314 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:55 crc kubenswrapper[4883]: E0310 09:05:55.079355 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:55 crc kubenswrapper[4883]: E0310 09:05:55.079397 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:57 crc kubenswrapper[4883]: I0310 09:05:57.079264 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:57 crc kubenswrapper[4883]: I0310 09:05:57.079313 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:57 crc kubenswrapper[4883]: E0310 09:05:57.079432 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:57 crc kubenswrapper[4883]: I0310 09:05:57.079523 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:57 crc kubenswrapper[4883]: E0310 09:05:57.079618 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:57 crc kubenswrapper[4883]: I0310 09:05:57.079715 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:57 crc kubenswrapper[4883]: E0310 09:05:57.079759 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:57 crc kubenswrapper[4883]: E0310 09:05:57.080112 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.080012 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.659194 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/2.log" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.661564 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.662083 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:58 crc 
kubenswrapper[4883]: I0310 09:05:58.676068 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.687790 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.700187 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.713396 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b391942524
0e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a5550885
03f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10
T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.721985 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.732906 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.742264 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.751406 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.760679 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.775413 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.785890 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.798389 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.808470 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.819071 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.835025 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.844241 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc 
kubenswrapper[4883]: I0310 09:05:58.855221 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.866196 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.881158 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.079035 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.079111 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.079148 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.079048 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.079281 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.079206 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.079401 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.079538 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.156385 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.666559 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.667087 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/2.log" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.670008 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" exitCode=1 Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.670063 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.670127 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.670977 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.671202 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.685911 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.695160 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.705553 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.714710 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.725492 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42
ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.740725 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 
obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.752417 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.761083 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.770684 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.780339 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.789013 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.798415 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.807121 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.814724 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.825921 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.839569 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.848809 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858324 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858699 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858758 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858795 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858812 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.868653 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc 
kubenswrapper[4883]: E0310 09:05:59.870231 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873747 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873761 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873782 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873795 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.883448 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886435 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886553 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886610 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886680 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886746 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.895968 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899145 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899303 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899386 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899449 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.908271 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910863 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910875 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910892 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910904 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.926297 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.926415 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.675870 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/0.log" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.675929 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e883c29-520e-4b1f-b49c-3df10450d467" containerID="8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3" exitCode=1 Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.676003 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerDied","Data":"8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3"} Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.676377 4883 scope.go:117] "RemoveContainer" containerID="8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.678830 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.683731 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:00 crc kubenswrapper[4883]: E0310 09:06:00.683858 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.686994 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.696138 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.715446 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 
obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.729595 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.738331 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.749715 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.759054 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.769401 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.779232 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.788103 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.797281 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.809761 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.825147 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.836793 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.846256 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.854778 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.863623 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.870437 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.879359 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.886234 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc 
kubenswrapper[4883]: I0310 09:06:00.892814 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.903359 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.916367 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.925050 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.933317 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.941510 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.948305 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.957150 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.965961 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.974656 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.983053 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.990990 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.012624 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.027078 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.036788 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.049077 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.060084 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.069080 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.079580 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.079665 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.079734 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.079748 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:01 crc kubenswrapper[4883]: E0310 09:06:01.079894 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:01 crc kubenswrapper[4883]: E0310 09:06:01.080032 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:01 crc kubenswrapper[4883]: E0310 09:06:01.080138 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:01 crc kubenswrapper[4883]: E0310 09:06:01.080237 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.689160 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/0.log" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.689245 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0"} Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.702301 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.712571 4883 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.721496 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.730575 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.739524 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.747704 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc 
kubenswrapper[4883]: I0310 09:06:01.754927 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.765109 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.780319 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.789003 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:
35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.796468 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.806202 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.816995 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.825311 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.834274 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.843943 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.852498 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.860788 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.875141 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:03 crc kubenswrapper[4883]: I0310 09:06:03.079571 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:03 crc kubenswrapper[4883]: I0310 09:06:03.079591 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:03 crc kubenswrapper[4883]: E0310 09:06:03.079728 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:03 crc kubenswrapper[4883]: I0310 09:06:03.079819 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:03 crc kubenswrapper[4883]: E0310 09:06:03.080022 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:03 crc kubenswrapper[4883]: E0310 09:06:03.080117 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:03 crc kubenswrapper[4883]: I0310 09:06:03.080163 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:03 crc kubenswrapper[4883]: E0310 09:06:03.080304 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.090504 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.101609 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.110422 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.120180 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.126676 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.134851 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.141923 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.149397 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: E0310 09:06:04.156909 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.160528 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.174382 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.183610 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.192551 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.200534 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.208868 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.217292 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.224088 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc 
kubenswrapper[4883]: I0310 09:06:04.232244 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.240649 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.252869 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:05 crc kubenswrapper[4883]: I0310 09:06:05.079509 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:05 crc kubenswrapper[4883]: I0310 09:06:05.079551 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:05 crc kubenswrapper[4883]: E0310 09:06:05.079640 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:05 crc kubenswrapper[4883]: I0310 09:06:05.079650 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:05 crc kubenswrapper[4883]: I0310 09:06:05.079807 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:05 crc kubenswrapper[4883]: E0310 09:06:05.079852 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:05 crc kubenswrapper[4883]: E0310 09:06:05.079920 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:05 crc kubenswrapper[4883]: E0310 09:06:05.079977 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:07 crc kubenswrapper[4883]: I0310 09:06:07.079830 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:07 crc kubenswrapper[4883]: I0310 09:06:07.080357 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:07 crc kubenswrapper[4883]: E0310 09:06:07.080606 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:07 crc kubenswrapper[4883]: I0310 09:06:07.080839 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:07 crc kubenswrapper[4883]: I0310 09:06:07.081248 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:07 crc kubenswrapper[4883]: E0310 09:06:07.082452 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:07 crc kubenswrapper[4883]: E0310 09:06:07.082510 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:07 crc kubenswrapper[4883]: E0310 09:06:07.082250 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.079652 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.079679 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.079735 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.079755 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.079805 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.079932 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.080066 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.080107 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.157884 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977836 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977883 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977894 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977915 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977927 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:09Z","lastTransitionTime":"2026-03-10T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.990205 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:09Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993805 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993842 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993852 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993870 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993881 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:09Z","lastTransitionTime":"2026-03-10T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.003234 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:10Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006145 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006167 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006174 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:10Z","lastTransitionTime":"2026-03-10T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.015247 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:10Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018189 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018252 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018263 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:10Z","lastTransitionTime":"2026-03-10T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.027724 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:10Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030587 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030613 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030623 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:10Z","lastTransitionTime":"2026-03-10T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.039795 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:10Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.039956 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:06:11 crc kubenswrapper[4883]: I0310 09:06:11.079798 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:11 crc kubenswrapper[4883]: I0310 09:06:11.079792 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:11 crc kubenswrapper[4883]: E0310 09:06:11.079964 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:11 crc kubenswrapper[4883]: I0310 09:06:11.079994 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:11 crc kubenswrapper[4883]: E0310 09:06:11.080301 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:11 crc kubenswrapper[4883]: E0310 09:06:11.080576 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:11 crc kubenswrapper[4883]: I0310 09:06:11.080794 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:11 crc kubenswrapper[4883]: E0310 09:06:11.081192 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:13 crc kubenswrapper[4883]: I0310 09:06:13.079742 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:13 crc kubenswrapper[4883]: I0310 09:06:13.079774 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:13 crc kubenswrapper[4883]: E0310 09:06:13.079897 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:13 crc kubenswrapper[4883]: I0310 09:06:13.079930 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:13 crc kubenswrapper[4883]: I0310 09:06:13.079753 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:13 crc kubenswrapper[4883]: E0310 09:06:13.080103 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:13 crc kubenswrapper[4883]: E0310 09:06:13.080162 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:13 crc kubenswrapper[4883]: E0310 09:06:13.080262 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.080643 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:14 crc kubenswrapper[4883]: E0310 09:06:14.080828 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.090223 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.099599 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.109542 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.122224 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.130308 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.138187 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.145437 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc 
kubenswrapper[4883]: I0310 09:06:14.151967 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: E0310 09:06:14.158723 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.160996 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.169822 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.182369 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.189107 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.202513 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.213314 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.220787 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.228801 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.236571 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.245042 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.252201 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:15 crc kubenswrapper[4883]: I0310 09:06:15.078851 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:15 crc kubenswrapper[4883]: I0310 09:06:15.078987 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:15 crc kubenswrapper[4883]: I0310 09:06:15.078985 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:15 crc kubenswrapper[4883]: E0310 09:06:15.079156 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:15 crc kubenswrapper[4883]: I0310 09:06:15.079193 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:15 crc kubenswrapper[4883]: E0310 09:06:15.079214 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:15 crc kubenswrapper[4883]: E0310 09:06:15.079282 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:15 crc kubenswrapper[4883]: E0310 09:06:15.079438 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:17 crc kubenswrapper[4883]: I0310 09:06:17.079377 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:17 crc kubenswrapper[4883]: E0310 09:06:17.079500 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:17 crc kubenswrapper[4883]: I0310 09:06:17.079548 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:17 crc kubenswrapper[4883]: E0310 09:06:17.079587 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:17 crc kubenswrapper[4883]: I0310 09:06:17.079694 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:17 crc kubenswrapper[4883]: I0310 09:06:17.079718 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:17 crc kubenswrapper[4883]: E0310 09:06:17.079831 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:17 crc kubenswrapper[4883]: E0310 09:06:17.079931 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:19 crc kubenswrapper[4883]: I0310 09:06:19.079415 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:19 crc kubenswrapper[4883]: I0310 09:06:19.079469 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.079578 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:19 crc kubenswrapper[4883]: I0310 09:06:19.079423 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:19 crc kubenswrapper[4883]: I0310 09:06:19.079634 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.079741 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.079826 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.079875 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.159156 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152786 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152842 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152862 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152872 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:20Z","lastTransitionTime":"2026-03-10T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:06:20 crc kubenswrapper[4883]: E0310 09:06:20.164169 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167377 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167414 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167424 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167438 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167449 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:20Z","lastTransitionTime":"2026-03-10T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:20 crc kubenswrapper[4883]: E0310 09:06:20.176198 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.178961 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.178991 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.179000 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.179025 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.179038 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:20Z","lastTransitionTime":"2026-03-10T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:20 crc kubenswrapper[4883]: E0310 09:06:20.187365 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189891 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189926 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189939 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189953 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189965 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:20Z","lastTransitionTime":"2026-03-10T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.231124 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9"] Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.231537 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.232811 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.233223 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.233406 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.234390 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.247362 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=93.247341853 podStartE2EDuration="1m33.247341853s" podCreationTimestamp="2026-03-10 09:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.247122856 +0000 UTC m=+166.502020745" watchObservedRunningTime="2026-03-10 09:06:20.247341853 +0000 UTC m=+166.502239742" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.263436 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.2634123 podStartE2EDuration="1m18.2634123s" podCreationTimestamp="2026-03-10 09:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.263253729 +0000 UTC m=+166.518151618" watchObservedRunningTime="2026-03-10 09:06:20.2634123 +0000 UTC m=+166.518310200" Mar 10 09:06:20 crc 
kubenswrapper[4883]: I0310 09:06:20.272034 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=29.271675544 podStartE2EDuration="29.271675544s" podCreationTimestamp="2026-03-10 09:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.271277947 +0000 UTC m=+166.526175837" watchObservedRunningTime="2026-03-10 09:06:20.271675544 +0000 UTC m=+166.526573433" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286091 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286131 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16a87c79-f274-4b63-9efd-b7ab322e8567-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286163 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a87c79-f274-4b63-9efd-b7ab322e8567-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286189 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286203 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16a87c79-f274-4b63-9efd-b7ab322e8567-service-ca\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.298430 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=85.298412722 podStartE2EDuration="1m25.298412722s" podCreationTimestamp="2026-03-10 09:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.298271242 +0000 UTC m=+166.553169131" watchObservedRunningTime="2026-03-10 09:06:20.298412722 +0000 UTC m=+166.553310610" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.305495 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vvbjw" podStartSLOduration=138.305464267 podStartE2EDuration="2m18.305464267s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.305096387 +0000 UTC m=+166.559994276" watchObservedRunningTime="2026-03-10 09:06:20.305464267 +0000 UTC m=+166.560362156" Mar 10 
09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.314927 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-p898z" podStartSLOduration=138.314913769 podStartE2EDuration="2m18.314913769s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.31459358 +0000 UTC m=+166.569491479" watchObservedRunningTime="2026-03-10 09:06:20.314913769 +0000 UTC m=+166.569811658" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.332204 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=56.33218832 podStartE2EDuration="56.33218832s" podCreationTimestamp="2026-03-10 09:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.331774572 +0000 UTC m=+166.586672461" watchObservedRunningTime="2026-03-10 09:06:20.33218832 +0000 UTC m=+166.587086209" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.344212 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" podStartSLOduration=138.344192305 podStartE2EDuration="2m18.344192305s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.344073349 +0000 UTC m=+166.598971238" watchObservedRunningTime="2026-03-10 09:06:20.344192305 +0000 UTC m=+166.599090194" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.382364 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" podStartSLOduration=138.382349045 
podStartE2EDuration="2m18.382349045s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.381867819 +0000 UTC m=+166.636765707" watchObservedRunningTime="2026-03-10 09:06:20.382349045 +0000 UTC m=+166.637246934" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387215 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a87c79-f274-4b63-9efd-b7ab322e8567-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387269 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387289 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16a87c79-f274-4b63-9efd-b7ab322e8567-service-ca\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387357 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: 
\"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387379 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16a87c79-f274-4b63-9efd-b7ab322e8567-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387584 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387637 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.388287 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16a87c79-f274-4b63-9efd-b7ab322e8567-service-ca\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.391931 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/16a87c79-f274-4b63-9efd-b7ab322e8567-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.394723 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7xb47" podStartSLOduration=138.394713547 podStartE2EDuration="2m18.394713547s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.393961715 +0000 UTC m=+166.648859604" watchObservedRunningTime="2026-03-10 09:06:20.394713547 +0000 UTC m=+166.649611426" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.400183 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16a87c79-f274-4b63-9efd-b7ab322e8567-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.422639 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podStartSLOduration=138.422617727 podStartE2EDuration="2m18.422617727s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.422539979 +0000 UTC m=+166.677437867" watchObservedRunningTime="2026-03-10 09:06:20.422617727 +0000 UTC m=+166.677515605" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.542762 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.744179 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" event={"ID":"16a87c79-f274-4b63-9efd-b7ab322e8567","Type":"ContainerStarted","Data":"1a413818aabae19edcf2afd8ad0856cc578e90ff4b9bcb2cef1d239e938c8388"} Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.744419 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" event={"ID":"16a87c79-f274-4b63-9efd-b7ab322e8567","Type":"ContainerStarted","Data":"23873ee8e0141ab19e7238d4b2496a8877c5418d90616f1cedea30304728607a"} Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.757144 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" podStartSLOduration=138.757127639 podStartE2EDuration="2m18.757127639s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.756805225 +0000 UTC m=+167.011703114" watchObservedRunningTime="2026-03-10 09:06:20.757127639 +0000 UTC m=+167.012025529" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.079220 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.079259 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.079299 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.079282 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:21 crc kubenswrapper[4883]: E0310 09:06:21.079365 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:21 crc kubenswrapper[4883]: E0310 09:06:21.079499 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:21 crc kubenswrapper[4883]: E0310 09:06:21.079611 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:21 crc kubenswrapper[4883]: E0310 09:06:21.079688 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.144628 4883 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.150895 4883 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 09:06:23 crc kubenswrapper[4883]: I0310 09:06:23.079492 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:23 crc kubenswrapper[4883]: I0310 09:06:23.079559 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:23 crc kubenswrapper[4883]: I0310 09:06:23.079573 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:23 crc kubenswrapper[4883]: E0310 09:06:23.079659 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:23 crc kubenswrapper[4883]: E0310 09:06:23.079884 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:23 crc kubenswrapper[4883]: I0310 09:06:23.080104 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:23 crc kubenswrapper[4883]: E0310 09:06:23.080151 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:23 crc kubenswrapper[4883]: E0310 09:06:23.080449 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:24 crc kubenswrapper[4883]: E0310 09:06:24.160617 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.078973 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.079062 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.079108 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.079139 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.079206 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.079333 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.079437 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.079775 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.079927 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.080106 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:06:27 crc kubenswrapper[4883]: I0310 09:06:27.079325 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:27 crc kubenswrapper[4883]: I0310 09:06:27.079407 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:27 crc kubenswrapper[4883]: E0310 09:06:27.079444 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:27 crc kubenswrapper[4883]: I0310 09:06:27.079466 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:27 crc kubenswrapper[4883]: I0310 09:06:27.079467 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:27 crc kubenswrapper[4883]: E0310 09:06:27.079557 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:27 crc kubenswrapper[4883]: E0310 09:06:27.079649 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:27 crc kubenswrapper[4883]: E0310 09:06:27.079787 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:29 crc kubenswrapper[4883]: I0310 09:06:29.078849 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:29 crc kubenswrapper[4883]: I0310 09:06:29.078876 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:29 crc kubenswrapper[4883]: I0310 09:06:29.078905 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.078968 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:29 crc kubenswrapper[4883]: I0310 09:06:29.078988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.079090 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.079232 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.079338 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.162080 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:31 crc kubenswrapper[4883]: I0310 09:06:31.079645 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:31 crc kubenswrapper[4883]: I0310 09:06:31.079739 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:31 crc kubenswrapper[4883]: E0310 09:06:31.079798 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:31 crc kubenswrapper[4883]: I0310 09:06:31.079831 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:31 crc kubenswrapper[4883]: E0310 09:06:31.080008 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:31 crc kubenswrapper[4883]: E0310 09:06:31.080126 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:31 crc kubenswrapper[4883]: I0310 09:06:31.080377 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:31 crc kubenswrapper[4883]: E0310 09:06:31.080448 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:33 crc kubenswrapper[4883]: I0310 09:06:33.079355 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:33 crc kubenswrapper[4883]: I0310 09:06:33.079425 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:33 crc kubenswrapper[4883]: I0310 09:06:33.079506 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:33 crc kubenswrapper[4883]: E0310 09:06:33.079585 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:33 crc kubenswrapper[4883]: I0310 09:06:33.079453 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:33 crc kubenswrapper[4883]: E0310 09:06:33.079669 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:33 crc kubenswrapper[4883]: E0310 09:06:33.079751 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:33 crc kubenswrapper[4883]: E0310 09:06:33.079826 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:34 crc kubenswrapper[4883]: E0310 09:06:34.162964 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:35 crc kubenswrapper[4883]: I0310 09:06:35.079351 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:35 crc kubenswrapper[4883]: I0310 09:06:35.079400 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:35 crc kubenswrapper[4883]: I0310 09:06:35.079402 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:35 crc kubenswrapper[4883]: E0310 09:06:35.079557 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:35 crc kubenswrapper[4883]: I0310 09:06:35.079581 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:35 crc kubenswrapper[4883]: E0310 09:06:35.079729 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:35 crc kubenswrapper[4883]: E0310 09:06:35.079806 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:35 crc kubenswrapper[4883]: E0310 09:06:35.079868 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:36 crc kubenswrapper[4883]: I0310 09:06:36.080607 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:36 crc kubenswrapper[4883]: E0310 09:06:36.080899 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:06:37 crc kubenswrapper[4883]: I0310 09:06:37.079867 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:37 crc kubenswrapper[4883]: I0310 09:06:37.079935 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:37 crc kubenswrapper[4883]: I0310 09:06:37.079895 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:37 crc kubenswrapper[4883]: I0310 09:06:37.079867 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:37 crc kubenswrapper[4883]: E0310 09:06:37.080054 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:37 crc kubenswrapper[4883]: E0310 09:06:37.080230 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:37 crc kubenswrapper[4883]: E0310 09:06:37.080270 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:37 crc kubenswrapper[4883]: E0310 09:06:37.080586 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:39 crc kubenswrapper[4883]: I0310 09:06:39.078993 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:39 crc kubenswrapper[4883]: I0310 09:06:39.079084 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:39 crc kubenswrapper[4883]: I0310 09:06:39.078988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:39 crc kubenswrapper[4883]: I0310 09:06:39.079183 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.079178 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.079232 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.079277 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.079376 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.164640 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:41 crc kubenswrapper[4883]: I0310 09:06:41.079177 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:41 crc kubenswrapper[4883]: E0310 09:06:41.080043 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:41 crc kubenswrapper[4883]: I0310 09:06:41.079317 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:41 crc kubenswrapper[4883]: E0310 09:06:41.080229 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:41 crc kubenswrapper[4883]: I0310 09:06:41.079289 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:41 crc kubenswrapper[4883]: E0310 09:06:41.080425 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:41 crc kubenswrapper[4883]: I0310 09:06:41.079563 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:41 crc kubenswrapper[4883]: E0310 09:06:41.080628 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:43 crc kubenswrapper[4883]: I0310 09:06:43.079783 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:43 crc kubenswrapper[4883]: I0310 09:06:43.079976 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:43 crc kubenswrapper[4883]: I0310 09:06:43.080027 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:43 crc kubenswrapper[4883]: I0310 09:06:43.080190 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:43 crc kubenswrapper[4883]: E0310 09:06:43.080200 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:43 crc kubenswrapper[4883]: E0310 09:06:43.080324 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:43 crc kubenswrapper[4883]: E0310 09:06:43.080501 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:43 crc kubenswrapper[4883]: E0310 09:06:43.080610 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:44 crc kubenswrapper[4883]: E0310 09:06:44.165287 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:45 crc kubenswrapper[4883]: I0310 09:06:45.079795 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:45 crc kubenswrapper[4883]: E0310 09:06:45.079960 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:45 crc kubenswrapper[4883]: I0310 09:06:45.080099 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:45 crc kubenswrapper[4883]: I0310 09:06:45.080151 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:45 crc kubenswrapper[4883]: I0310 09:06:45.080324 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:45 crc kubenswrapper[4883]: E0310 09:06:45.080355 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:45 crc kubenswrapper[4883]: E0310 09:06:45.080533 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:45 crc kubenswrapper[4883]: E0310 09:06:45.080651 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.830003 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/1.log" Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831021 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/0.log" Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831085 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e883c29-520e-4b1f-b49c-3df10450d467" containerID="498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0" exitCode=1 Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831124 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerDied","Data":"498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0"} Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831173 4883 scope.go:117] "RemoveContainer" containerID="8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3" Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831785 4883 scope.go:117] "RemoveContainer" containerID="498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0" Mar 10 09:06:46 crc kubenswrapper[4883]: E0310 09:06:46.832031 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467)\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.079207 4883 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.079760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:47 crc kubenswrapper[4883]: E0310 09:06:47.079956 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.079981 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:47 crc kubenswrapper[4883]: E0310 09:06:47.080175 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.080008 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:47 crc kubenswrapper[4883]: E0310 09:06:47.080370 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:47 crc kubenswrapper[4883]: E0310 09:06:47.080175 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.836164 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/1.log" Mar 10 09:06:49 crc kubenswrapper[4883]: I0310 09:06:49.079865 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:49 crc kubenswrapper[4883]: I0310 09:06:49.079973 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:49 crc kubenswrapper[4883]: I0310 09:06:49.079985 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:49 crc kubenswrapper[4883]: I0310 09:06:49.080006 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.080134 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.080270 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.080542 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.080618 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.166640 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.079247 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.079302 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.079341 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.079253 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.079413 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.079569 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.079732 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.080027 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.080252 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.752272 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gmq5n"] Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.852427 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.855809 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.855984 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.856224 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.856589 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:06:53 crc kubenswrapper[4883]: I0310 09:06:53.079436 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:53 crc kubenswrapper[4883]: I0310 09:06:53.079506 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:53 crc kubenswrapper[4883]: I0310 09:06:53.079638 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:53 crc kubenswrapper[4883]: E0310 09:06:53.079640 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:53 crc kubenswrapper[4883]: I0310 09:06:53.079739 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:53 crc kubenswrapper[4883]: E0310 09:06:53.079934 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:53 crc kubenswrapper[4883]: E0310 09:06:53.079993 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:53 crc kubenswrapper[4883]: E0310 09:06:53.080045 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:54 crc kubenswrapper[4883]: E0310 09:06:54.167429 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078520 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078596 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078650 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078750 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.078709743 +0000 UTC m=+323.333607642 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078788 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078818 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078878 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.078853193 +0000 UTC m=+323.333751082 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078901 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:08:57.078891394 +0000 UTC m=+323.333789283 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078944 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.079017 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.079094 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.079201 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.079293 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.079355 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.179315 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.179373 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.179395 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179517 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179523 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179559 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179624 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.179609232 +0000 UTC m=+323.434507120 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179569 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179681 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179535 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179736 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.179721132 +0000 UTC m=+323.434619022 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179741 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179773 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.179764654 +0000 UTC m=+323.434662533 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:06:57 crc kubenswrapper[4883]: I0310 09:06:57.079582 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:57 crc kubenswrapper[4883]: I0310 09:06:57.079667 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:57 crc kubenswrapper[4883]: I0310 09:06:57.079714 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:57 crc kubenswrapper[4883]: E0310 09:06:57.079758 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:57 crc kubenswrapper[4883]: I0310 09:06:57.079830 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:57 crc kubenswrapper[4883]: E0310 09:06:57.079939 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:57 crc kubenswrapper[4883]: E0310 09:06:57.080076 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:57 crc kubenswrapper[4883]: E0310 09:06:57.080219 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:58 crc kubenswrapper[4883]: I0310 09:06:58.079774 4883 scope.go:117] "RemoveContainer" containerID="498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0" Mar 10 09:06:58 crc kubenswrapper[4883]: I0310 09:06:58.121140 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podStartSLOduration=176.121123091 podStartE2EDuration="2m56.121123091s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:51.898222421 +0000 UTC m=+198.153120300" watchObservedRunningTime="2026-03-10 09:06:58.121123091 +0000 UTC m=+204.376020981" Mar 10 09:06:58 crc kubenswrapper[4883]: I0310 09:06:58.878468 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/1.log" Mar 10 09:06:58 crc kubenswrapper[4883]: I0310 09:06:58.878551 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d"} Mar 10 09:06:59 crc kubenswrapper[4883]: I0310 09:06:59.079797 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:59 crc kubenswrapper[4883]: I0310 09:06:59.079797 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.080791 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:59 crc kubenswrapper[4883]: I0310 09:06:59.079873 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.080954 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:59 crc kubenswrapper[4883]: I0310 09:06:59.079840 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.081056 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.081108 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.169218 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:07:01 crc kubenswrapper[4883]: I0310 09:07:01.079495 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:07:01 crc kubenswrapper[4883]: I0310 09:07:01.079572 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:07:01 crc kubenswrapper[4883]: I0310 09:07:01.079502 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:07:01 crc kubenswrapper[4883]: E0310 09:07:01.079651 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:07:01 crc kubenswrapper[4883]: I0310 09:07:01.079763 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:07:01 crc kubenswrapper[4883]: E0310 09:07:01.079945 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:07:01 crc kubenswrapper[4883]: E0310 09:07:01.080096 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:07:01 crc kubenswrapper[4883]: E0310 09:07:01.080265 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:07:03 crc kubenswrapper[4883]: I0310 09:07:03.079141 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:07:03 crc kubenswrapper[4883]: I0310 09:07:03.079197 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:07:03 crc kubenswrapper[4883]: I0310 09:07:03.079144 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:07:03 crc kubenswrapper[4883]: E0310 09:07:03.079293 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:07:03 crc kubenswrapper[4883]: E0310 09:07:03.079348 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:07:03 crc kubenswrapper[4883]: I0310 09:07:03.079257 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:07:03 crc kubenswrapper[4883]: E0310 09:07:03.079522 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:07:03 crc kubenswrapper[4883]: E0310 09:07:03.079519 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.079447 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.079459 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.079600 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.080086 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.082401 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.082401 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.082726 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.083060 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.083242 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.083463 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.395345 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.429684 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h5tmh"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.430407 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.430595 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.430857 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.431203 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.431318 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.431425 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7clc9"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.431779 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437176 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437270 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437634 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437720 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437321 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 
09:07:11.437534 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437540 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437894 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437899 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.438127 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.438277 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.438417 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.439025 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.439349 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-42rrg"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.439823 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.440156 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.440469 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.440914 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-29pxk"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.441276 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.441626 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.441641 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.441796 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.442048 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.442853 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.443291 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.443704 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444024 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444249 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444659 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444809 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444849 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.445008 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.445196 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.450439 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.450878 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.452692 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-69msk"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.453219 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.454074 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.454621 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.454788 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.454971 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455001 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455006 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 
09:07:11.455138 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455202 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455215 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455227 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455339 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455617 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455743 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455856 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455900 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459722 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459751 4883 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459797 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459810 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459754 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459926 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.460094 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.465731 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.466007 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.466159 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.466547 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.466925 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 09:07:11 crc 
kubenswrapper[4883]: I0310 09:07:11.467058 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467180 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467261 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467405 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467450 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467643 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467789 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467801 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467957 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.468123 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.468494 4883 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.468509 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.468660 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.469178 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.469383 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486964 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.487282 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.487419 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.487581 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.485796 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.485945 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.487856 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.485994 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486047 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.488006 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486104 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486168 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.488128 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486904 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.488806 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.489164 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.489631 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ppkhj"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.490376 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.490392 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.491589 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.491621 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.501246 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.501357 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.501516 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.502723 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.503610 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.503865 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.503999 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504018 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504113 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504412 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504531 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504645 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4sznb"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504678 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504711 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.505369 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.505386 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.508561 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.509037 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.509320 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.509410 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.511585 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.511666 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.512189 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.514030 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.516570 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-42rrg"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.516605 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.517088 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.523965 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.524112 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.527065 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.527173 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.527757 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7clc9"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.527839 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.528851 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.530462 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.530559 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.533135 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.533316 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.533362 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9vv9k"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.533444 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.534141 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9vv9k"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.536158 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.541983 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.543928 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.548331 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.550729 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.551730 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h5tmh"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.551875 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.553900 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5fqgx"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.555931 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.556001 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.556292 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.556410 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.556878 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.557579 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-29pxk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.558199 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.558949 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.559192 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.559953 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.560227 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.561533 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.563904 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564016 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564222 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564592 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564642 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564841 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.565267 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.565742 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dh2nm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.566748 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.567049 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.567502 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.568821 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.570295 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.570484 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.572465 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.572825 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.572924 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.573302 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.573430 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-jp7d9"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.573703 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.574001 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-69msk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.574924 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4sznb"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.576005 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ppkhj"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.577194 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.579137 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.580408 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.581489 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.582693 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.583853 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kzsbn"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.585039 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.586077 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.587202 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.588146 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.590185 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.592215 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.598166 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.600600 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.604124 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.606154 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.607346 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.608948 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.609891 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-serving-cert\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.609930 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04125307-b213-4579-8042-92284900796b-metrics-tls\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.609955 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-serving-cert\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.609980 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-auth-proxy-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610013 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610036 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610062 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610082 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610108 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610196 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-trusted-ca\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610254 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610323 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-images\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610352 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcx47\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-kube-api-access-lcx47\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610375 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610397 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-dir\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610419 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04118e18-43d2-4aed-9812-aba776c0bf61-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610438 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610461 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610501 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-client\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610523 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610546 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610633 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmprt\" (UniqueName: \"kubernetes.io/projected/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-kube-api-access-vmprt\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610707 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xknbc\" (UniqueName: \"kubernetes.io/projected/93468e41-3e48-469f-90a9-7e05e45fe141-kube-api-access-xknbc\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610747 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-client\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610790 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610825 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bed2a913-4f7d-4a64-aed8-a510280c9b6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610848 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8pnh\" (UniqueName: \"kubernetes.io/projected/04118e18-43d2-4aed-9812-aba776c0bf61-kube-api-access-d8pnh\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610896 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610943 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610969 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610994 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611025 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-config\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611046 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611047 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611081 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-config\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9"
Mar 10 09:07:11 crc kubenswrapper[4883]: 
I0310 09:07:11.611162 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg6gc\" (UniqueName: \"kubernetes.io/projected/040462f9-c464-41cc-8843-cac46b3da8bf-kube-api-access-lg6gc\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611230 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54tj\" (UniqueName: \"kubernetes.io/projected/09a0b780-3bf5-4607-9907-33e16ae4f098-kube-api-access-j54tj\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611258 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-encryption-config\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611295 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611318 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611337 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04118e18-43d2-4aed-9812-aba776c0bf61-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611360 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611387 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611413 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-service-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611432 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-config\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611449 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611470 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611527 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611578 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-config\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611602 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611626 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwk4d\" (UniqueName: \"kubernetes.io/projected/ca36a0b9-d7c9-4195-803b-53d41ac683d9-kube-api-access-bwk4d\") pod \"downloads-7954f5f757-69msk\" (UID: \"ca36a0b9-d7c9-4195-803b-53d41ac683d9\") " pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611651 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/491a1079-cbfa-470e-b91b-84e323ae0c6d-metrics-tls\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611692 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611715 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-policies\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611750 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qbqb\" (UniqueName: \"kubernetes.io/projected/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-kube-api-access-2qbqb\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611788 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cczd7\" (UniqueName: \"kubernetes.io/projected/d6acff1e-cd79-44a7-bb48-1a79857b2a97-kube-api-access-cczd7\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611808 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611826 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611952 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04125307-b213-4579-8042-92284900796b-trusted-ca\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612018 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612032 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bed2a913-4f7d-4a64-aed8-a510280c9b6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612053 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612074 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mhr2\" (UniqueName: 
\"kubernetes.io/projected/491a1079-cbfa-470e-b91b-84e323ae0c6d-kube-api-access-6mhr2\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612096 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612071 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612120 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612192 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-serving-cert\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612245 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612268 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btm62\" (UniqueName: \"kubernetes.io/projected/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-kube-api-access-btm62\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612291 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9kh\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-kube-api-access-7b9kh\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613089 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040462f9-c464-41cc-8843-cac46b3da8bf-serving-cert\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613144 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09a0b780-3bf5-4607-9907-33e16ae4f098-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613174 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d6acff1e-cd79-44a7-bb48-1a79857b2a97-machine-approver-tls\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613181 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613197 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613257 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613340 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m7986\" (UID: 
\"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613989 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.614971 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.616110 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.617094 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5fqgx"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.618058 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.618973 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.620302 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.621391 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.622978 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kzsbn"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.624219 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dh2nm"] Mar 10 09:07:11 
crc kubenswrapper[4883]: I0310 09:07:11.625101 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bzfz7"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.625928 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.626013 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x6pxw"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.627205 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.627853 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x6pxw"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.632776 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.644157 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5nj7x"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.645031 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.652869 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.653334 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5nj7x"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.673607 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.693259 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.712085 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714488 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-config\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714527 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg6gc\" (UniqueName: \"kubernetes.io/projected/040462f9-c464-41cc-8843-cac46b3da8bf-kube-api-access-lg6gc\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714559 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54tj\" (UniqueName: 
\"kubernetes.io/projected/09a0b780-3bf5-4607-9907-33e16ae4f098-kube-api-access-j54tj\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714580 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-encryption-config\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714601 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714626 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714649 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04118e18-43d2-4aed-9812-aba776c0bf61-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714674 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714695 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714711 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714731 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-service-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714750 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-config\") pod \"console-operator-58897d9998-42rrg\" 
(UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714773 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714792 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714813 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-config\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714832 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714851 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwk4d\" 
(UniqueName: \"kubernetes.io/projected/ca36a0b9-d7c9-4195-803b-53d41ac683d9-kube-api-access-bwk4d\") pod \"downloads-7954f5f757-69msk\" (UID: \"ca36a0b9-d7c9-4195-803b-53d41ac683d9\") " pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714869 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/491a1079-cbfa-470e-b91b-84e323ae0c6d-metrics-tls\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714891 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714906 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-policies\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714923 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qbqb\" (UniqueName: \"kubernetes.io/projected/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-kube-api-access-2qbqb\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714952 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cczd7\" (UniqueName: \"kubernetes.io/projected/d6acff1e-cd79-44a7-bb48-1a79857b2a97-kube-api-access-cczd7\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714970 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714991 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715008 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04125307-b213-4579-8042-92284900796b-trusted-ca\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715037 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bed2a913-4f7d-4a64-aed8-a510280c9b6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 
09:07:11.715447 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-config\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715759 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715801 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715821 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mhr2\" (UniqueName: \"kubernetes.io/projected/491a1079-cbfa-470e-b91b-84e323ae0c6d-kube-api-access-6mhr2\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715845 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715871 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715894 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-serving-cert\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715916 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btm62\" (UniqueName: \"kubernetes.io/projected/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-kube-api-access-btm62\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715936 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9kh\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-kube-api-access-7b9kh\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715969 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/040462f9-c464-41cc-8843-cac46b3da8bf-serving-cert\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715989 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09a0b780-3bf5-4607-9907-33e16ae4f098-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716012 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d6acff1e-cd79-44a7-bb48-1a79857b2a97-machine-approver-tls\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716032 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716049 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-policies\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 
09:07:11.716112 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716256 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716055 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716371 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716411 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-auth-proxy-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: 
\"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716443 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-serving-cert\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716468 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04125307-b213-4579-8042-92284900796b-metrics-tls\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716523 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-serving-cert\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716573 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716604 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716626 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716652 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716674 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-trusted-ca\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716734 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716759 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-images\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716787 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcx47\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-kube-api-access-lcx47\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716808 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716828 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716847 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-dir\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716866 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04118e18-43d2-4aed-9812-aba776c0bf61-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716887 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716906 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716893 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716920 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716999 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-client\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717030 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717059 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717084 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vmprt\" (UniqueName: \"kubernetes.io/projected/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-kube-api-access-vmprt\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717122 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xknbc\" (UniqueName: \"kubernetes.io/projected/93468e41-3e48-469f-90a9-7e05e45fe141-kube-api-access-xknbc\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717147 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-client\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717182 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717204 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bed2a913-4f7d-4a64-aed8-a510280c9b6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717231 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717255 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8pnh\" (UniqueName: \"kubernetes.io/projected/04118e18-43d2-4aed-9812-aba776c0bf61-kube-api-access-d8pnh\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717280 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717304 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717323 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-service-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717390 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717398 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717428 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717452 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-config\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717516 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-config\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717883 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.718212 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.718568 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bed2a913-4f7d-4a64-aed8-a510280c9b6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.719777 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.720873 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-trusted-ca\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716466 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.721367 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-auth-proxy-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.721580 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.722340 4883 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-images\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.722767 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-client\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.723099 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.723406 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-serving-cert\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.723432 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-dir\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.723501 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.724738 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-encryption-config\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725003 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04118e18-43d2-4aed-9812-aba776c0bf61-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725126 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09a0b780-3bf5-4607-9907-33e16ae4f098-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725299 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 
09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725361 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040462f9-c464-41cc-8843-cac46b3da8bf-serving-cert\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725803 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726082 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726111 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726141 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04118e18-43d2-4aed-9812-aba776c0bf61-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726188 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726419 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/491a1079-cbfa-470e-b91b-84e323ae0c6d-metrics-tls\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726541 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726530 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") pod 
\"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726879 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727305 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727324 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727458 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727690 4883 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727904 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.728229 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bed2a913-4f7d-4a64-aed8-a510280c9b6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.728420 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-serving-cert\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.728645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d6acff1e-cd79-44a7-bb48-1a79857b2a97-machine-approver-tls\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 
09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.728946 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.732678 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.740916 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-client\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.752533 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.773392 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.784346 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-serving-cert\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.792663 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.796185 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.812799 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.832138 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.838156 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-config\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.852198 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.872857 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.892251 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.912571 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.932436 4883 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.952517 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.965467 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04125307-b213-4579-8042-92284900796b-metrics-tls\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.972922 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.998985 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.006994 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04125307-b213-4579-8042-92284900796b-trusted-ca\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.012499 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.032620 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.072874 4883 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.092181 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.112323 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.132588 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.152940 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.158235 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.172873 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.192763 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.197277 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-config\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: 
\"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.212513 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.252649 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.272142 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.292417 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.313156 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.332843 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.352274 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.372440 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.392913 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.412116 4883 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.433266 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.452756 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.472951 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.492841 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.512326 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.532495 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.553150 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.571648 4883 request.go:700] Waited for 1.015040306s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.572789 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.592166 
4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.613079 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.632355 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.652628 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.672985 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.692709 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.712927 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.733241 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.752862 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.772234 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.792576 4883 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.812687 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.832107 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.852689 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.872387 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.893075 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.913046 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.932076 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.952125 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.972203 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.992704 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" 
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.013016 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.032755 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.052342 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.072189 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.093018 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.112071 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.133010 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.157700 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.172116 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.192014 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.213145 4883 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.233116 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.252503 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.272452 4883 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.292937 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.312425 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.332422 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.353049 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.372351 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.391872 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.412230 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 09:07:13 crc kubenswrapper[4883]: 
I0310 09:07:13.432409 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.452812 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.472573 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.492824 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.512573 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.545647 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg6gc\" (UniqueName: \"kubernetes.io/projected/040462f9-c464-41cc-8843-cac46b3da8bf-kube-api-access-lg6gc\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.566229 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54tj\" (UniqueName: \"kubernetes.io/projected/09a0b780-3bf5-4607-9907-33e16ae4f098-kube-api-access-j54tj\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.571737 4883 request.go:700] Waited for 1.856552989s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.584283 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.601616 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.604694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cczd7\" (UniqueName: \"kubernetes.io/projected/d6acff1e-cd79-44a7-bb48-1a79857b2a97-kube-api-access-cczd7\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.624615 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qbqb\" (UniqueName: \"kubernetes.io/projected/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-kube-api-access-2qbqb\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.644215 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.645098 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.656309 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.663552 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwk4d\" (UniqueName: \"kubernetes.io/projected/ca36a0b9-d7c9-4195-803b-53d41ac683d9-kube-api-access-bwk4d\") pod \"downloads-7954f5f757-69msk\" (UID: \"ca36a0b9-d7c9-4195-803b-53d41ac683d9\") " pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.669445 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:13 crc kubenswrapper[4883]: W0310 09:07:13.669625 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6acff1e_cd79_44a7_bb48_1a79857b2a97.slice/crio-c70a33bb3187dfa992af916aeaf8ced911bbee6b9fdd7a934eaa3610f16c201d WatchSource:0}: Error finding container c70a33bb3187dfa992af916aeaf8ced911bbee6b9fdd7a934eaa3610f16c201d: Status 404 returned error can't find the container with id c70a33bb3187dfa992af916aeaf8ced911bbee6b9fdd7a934eaa3610f16c201d Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.681759 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.687813 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.706834 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mhr2\" (UniqueName: \"kubernetes.io/projected/491a1079-cbfa-470e-b91b-84e323ae0c6d-kube-api-access-6mhr2\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.727763 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btm62\" (UniqueName: \"kubernetes.io/projected/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-kube-api-access-btm62\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.744449 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9kh\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-kube-api-access-7b9kh\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.766193 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcx47\" (UniqueName: 
\"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-kube-api-access-lcx47\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.778945 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-42rrg"] Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.786678 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmprt\" (UniqueName: \"kubernetes.io/projected/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-kube-api-access-vmprt\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:13 crc kubenswrapper[4883]: W0310 09:07:13.789662 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod040462f9_c464_41cc_8843_cac46b3da8bf.slice/crio-d0223f42f5794235bdc257e8a1d33d6a24f61139c5258c9520410fdbf1e6dcfa WatchSource:0}: Error finding container d0223f42f5794235bdc257e8a1d33d6a24f61139c5258c9520410fdbf1e6dcfa: Status 404 returned error can't find the container with id d0223f42f5794235bdc257e8a1d33d6a24f61139c5258c9520410fdbf1e6dcfa Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.801113 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"] Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.802835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xknbc\" (UniqueName: \"kubernetes.io/projected/93468e41-3e48-469f-90a9-7e05e45fe141-kube-api-access-xknbc\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:13 crc kubenswrapper[4883]: W0310 09:07:13.805694 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4210a360_cb3e_4fa8_8fd1_98217c9b00f2.slice/crio-de4450a201dc516a6ff019c78a5584d2658c919f722b342324e54eb5dc7c53aa WatchSource:0}: Error finding container de4450a201dc516a6ff019c78a5584d2658c919f722b342324e54eb5dc7c53aa: Status 404 returned error can't find the container with id de4450a201dc516a6ff019c78a5584d2658c919f722b342324e54eb5dc7c53aa Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.824833 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.828946 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-69msk"] Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.846968 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.860772 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.864004 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8pnh\" (UniqueName: \"kubernetes.io/projected/04118e18-43d2-4aed-9812-aba776c0bf61-kube-api-access-d8pnh\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.865899 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.878099 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.885261 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.890534 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.926747 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" event={"ID":"4210a360-cb3e-4fa8-8fd1-98217c9b00f2","Type":"ContainerStarted","Data":"de4450a201dc516a6ff019c78a5584d2658c919f722b342324e54eb5dc7c53aa"} Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.928402 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-69msk" event={"ID":"ca36a0b9-d7c9-4195-803b-53d41ac683d9","Type":"ContainerStarted","Data":"de329dc36faddd60c64b1dedfa63d8ddca8d095dee912c9e5a31c84a00e1c11c"} Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.928440 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-69msk" event={"ID":"ca36a0b9-d7c9-4195-803b-53d41ac683d9","Type":"ContainerStarted","Data":"22f8d183eb153b2ddb33201c0ba167d5aa42fd502fa549946bc80a934e74c703"} Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.929453 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.934115 4883 patch_prober.go:28] interesting pod/downloads-7954f5f757-69msk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.934142 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-69msk" podUID="ca36a0b9-d7c9-4195-803b-53d41ac683d9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 
09:07:13.938819 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" event={"ID":"d6acff1e-cd79-44a7-bb48-1a79857b2a97","Type":"ContainerStarted","Data":"059bd85844f9b25014ac9c4e79a79d2232e103f2f10f9db59e7c5a5f37f09c50"} Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.938848 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" event={"ID":"d6acff1e-cd79-44a7-bb48-1a79857b2a97","Type":"ContainerStarted","Data":"c70a33bb3187dfa992af916aeaf8ced911bbee6b9fdd7a934eaa3610f16c201d"} Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942610 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942668 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942700 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 
09:07:13.942717 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-image-import-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942765 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942781 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp2gg\" (UniqueName: \"kubernetes.io/projected/7ba6ab17-ada9-4712-bc66-09172d648791-kube-api-access-gp2gg\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942798 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9ea088-9f19-4839-bfe4-ce54842b04c2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8d9ea088-9f19-4839-bfe4-ce54842b04c2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942861 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942897 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942914 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-serving-cert\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942931 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-audit-dir\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc 
kubenswrapper[4883]: I0310 09:07:13.942947 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942981 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-audit\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943000 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba6ab17-ada9-4712-bc66-09172d648791-serving-cert\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943019 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943034 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") pod 
\"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943081 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed479632-f556-407c-a8a9-b40379bbf549-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943110 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943147 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943166 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-node-pullsecrets\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 
09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943190 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943231 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed479632-f556-407c-a8a9-b40379bbf549-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943250 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed479632-f556-407c-a8a9-b40379bbf549-config\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943269 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-serving-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943325 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943345 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943382 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-service-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943405 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-encryption-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943422 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-config\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" 
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943457 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943527 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-client\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943565 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943607 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943633 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb4j7\" (UniqueName: 
\"kubernetes.io/projected/8d9ea088-9f19-4839-bfe4-ce54842b04c2-kube-api-access-tb4j7\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943773 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl5n5\" (UniqueName: \"kubernetes.io/projected/d94eaa88-cfd0-497d-804d-922ebd316b33-kube-api-access-zl5n5\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943809 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:13 crc kubenswrapper[4883]: E0310 09:07:13.943949 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.443937002 +0000 UTC m=+220.698834891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.945725 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-42rrg" event={"ID":"040462f9-c464-41cc-8843-cac46b3da8bf","Type":"ContainerStarted","Data":"6f8cc6d827bc39c1c8ed710e9a87c7da199a7cac4ffbb7438cd44cda65d0f820"} Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.945750 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-42rrg" event={"ID":"040462f9-c464-41cc-8843-cac46b3da8bf","Type":"ContainerStarted","Data":"d0223f42f5794235bdc257e8a1d33d6a24f61139c5258c9520410fdbf1e6dcfa"} Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.946049 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.947945 4883 patch_prober.go:28] interesting pod/console-operator-58897d9998-42rrg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.947979 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-42rrg" podUID="040462f9-c464-41cc-8843-cac46b3da8bf" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.963760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.975965 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.988130 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.993212 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.021014 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.029933 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.033948 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.043868 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.045722 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.045903 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.545882691 +0000 UTC m=+220.800780580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.050555 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-srv-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.050851 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-245mx\" (UniqueName: 
\"kubernetes.io/projected/459d25fc-b392-4a73-bfce-6250fc05c6e4-kube-api-access-245mx\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.050929 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.050975 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f81e48af-a943-4b68-b259-3c0685529d42-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051001 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-srv-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051023 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051045 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggmxb\" (UniqueName: \"kubernetes.io/projected/89e1c086-5372-40ce-859d-3eb64bb06012-kube-api-access-ggmxb\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051063 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051094 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051116 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbf6q\" (UniqueName: \"kubernetes.io/projected/cf87b69c-5c1e-4297-82c9-ff39bf48b628-kube-api-access-tbf6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051135 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e56425c4-e04a-4313-a946-efc4ddac49ee-service-ca-bundle\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051167 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57fad383-2bee-48b1-b513-32a629c976aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051189 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-csi-data-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051224 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-config\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051244 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb4j7\" (UniqueName: \"kubernetes.io/projected/8d9ea088-9f19-4839-bfe4-ce54842b04c2-kube-api-access-tb4j7\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: 
\"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051266 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl5n5\" (UniqueName: \"kubernetes.io/projected/d94eaa88-cfd0-497d-804d-922ebd316b33-kube-api-access-zl5n5\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051289 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9409438-97ce-43a6-8a7f-24764925eb53-metrics-tls\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051307 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tksbw\" (UniqueName: \"kubernetes.io/projected/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-kube-api-access-tksbw\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051341 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051356 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-socket-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051384 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051403 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-image-import-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051418 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ftn9\" (UniqueName: \"kubernetes.io/projected/3c05291a-8935-4f5e-81c8-4523b3b7e558-kube-api-access-8ftn9\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051436 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf87b69c-5c1e-4297-82c9-ff39bf48b628-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051466 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828338b4-f6a3-4a38-9596-2556459de30a-config\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051527 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp2gg\" (UniqueName: \"kubernetes.io/projected/7ba6ab17-ada9-4712-bc66-09172d648791-kube-api-access-gp2gg\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051552 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-key\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051567 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf87b69c-5c1e-4297-82c9-ff39bf48b628-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051584 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-webhook-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051605 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d9ea088-9f19-4839-bfe4-ce54842b04c2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053743 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053778 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-serving-cert\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053798 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053823 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba6ab17-ada9-4712-bc66-09172d648791-serving-cert\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053842 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-plugins-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053882 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-metrics-certs\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ec510e9-f96b-44da-abec-7d49115d0c83-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054135 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed479632-f556-407c-a8a9-b40379bbf549-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054191 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-registration-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054222 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57fad383-2bee-48b1-b513-32a629c976aa-proxy-tls\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054245 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054265 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054288 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054309 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-certs\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054329 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zhd\" (UniqueName: \"kubernetes.io/projected/7ec510e9-f96b-44da-abec-7d49115d0c83-kube-api-access-t7zhd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054345 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-image-import-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054420 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054452 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-serving-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054490 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d8qw\" (UniqueName: \"kubernetes.io/projected/f81e48af-a943-4b68-b259-3c0685529d42-kube-api-access-4d8qw\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054536 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-node-bootstrap-token\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054560 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/828338b4-f6a3-4a38-9596-2556459de30a-serving-cert\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 
09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054591 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") pod \"auto-csr-approver-29552226-jp7d9\" (UID: \"632d4971-be4e-4939-a46a-42604b182436\") " pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054610 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc6sv\" (UniqueName: \"kubernetes.io/projected/854dee0a-96a6-41f9-bdbe-d0d820684605-kube-api-access-kc6sv\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054630 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-stats-auth\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054647 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-apiservice-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054668 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7shgc\" (UniqueName: \"kubernetes.io/projected/57fad383-2bee-48b1-b513-32a629c976aa-kube-api-access-7shgc\") pod 
\"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054689 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47s58\" (UniqueName: \"kubernetes.io/projected/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-kube-api-access-47s58\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054945 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d9ea088-9f19-4839-bfe4-ce54842b04c2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.055642 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.555628266 +0000 UTC m=+220.810526155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.056588 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.056661 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-serving-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.056884 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-service-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058752 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058821 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-kube-api-access-llg2x\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058842 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmhgt\" (UniqueName: \"kubernetes.io/projected/6fb9cd04-d1cb-446b-9bab-b054c51df85c-kube-api-access-jmhgt\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058867 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-encryption-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058905 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 
09:07:14.058923 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fb9cd04-d1cb-446b-9bab-b054c51df85c-proxy-tls\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058952 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-client\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058969 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.059028 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.059132 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.059995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-config\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.060200 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061645 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061718 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061767 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061786 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-cabundle\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061889 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062036 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9409438-97ce-43a6-8a7f-24764925eb53-config-volume\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062069 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/89e1c086-5372-40ce-859d-3eb64bb06012-tmpfs\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062123 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062142 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9ea088-9f19-4839-bfe4-ce54842b04c2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062162 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062205 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhbbw\" (UniqueName: \"kubernetes.io/projected/828338b4-f6a3-4a38-9596-2556459de30a-kube-api-access-zhbbw\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062223 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-profile-collector-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062274 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062291 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-audit-dir\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062308 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bml5\" (UniqueName: \"kubernetes.io/projected/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-kube-api-access-9bml5\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062325 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-audit\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062342 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-images\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062359 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062420 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-mountpoint-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062439 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062534 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.063322 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.064854 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-audit-dir\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.065119 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/459d25fc-b392-4a73-bfce-6250fc05c6e4-cert\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.065152 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhbbx\" (UniqueName: \"kubernetes.io/projected/6e2199dc-f886-4cde-aab8-60f4e4823840-kube-api-access-vhbbx\") pod \"migrator-59844c95c7-sw994\" (UID: \"6e2199dc-f886-4cde-aab8-60f4e4823840\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 
09:07:14.065155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.065625 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.065772 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066467 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066517 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-node-pullsecrets\") pod \"apiserver-76f77b778f-h5tmh\" (UID: 
\"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066538 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-default-certificate\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066585 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2d7p\" (UniqueName: \"kubernetes.io/projected/e56425c4-e04a-4313-a946-efc4ddac49ee-kube-api-access-h2d7p\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066607 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed479632-f556-407c-a8a9-b40379bbf549-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066625 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mnqz\" (UniqueName: \"kubernetes.io/projected/e9409438-97ce-43a6-8a7f-24764925eb53-kube-api-access-7mnqz\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066713 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-audit\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066773 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed479632-f556-407c-a8a9-b40379bbf549-config\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.067313 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.069723 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-node-pullsecrets\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.070362 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed479632-f556-407c-a8a9-b40379bbf549-config\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.071207 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed479632-f556-407c-a8a9-b40379bbf549-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.071900 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.072239 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.072980 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9ea088-9f19-4839-bfe4-ce54842b04c2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.075448 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-client\") pod \"apiserver-76f77b778f-h5tmh\" (UID: 
\"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.075614 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.076997 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.082239 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-serving-cert\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.082841 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba6ab17-ada9-4712-bc66-09172d648791-serving-cert\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.083230 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.084765 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.086709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.090924 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-encryption-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.091617 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.094624 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp2gg\" (UniqueName: \"kubernetes.io/projected/7ba6ab17-ada9-4712-bc66-09172d648791-kube-api-access-gp2gg\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.107651 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb4j7\" (UniqueName: \"kubernetes.io/projected/8d9ea088-9f19-4839-bfe4-ce54842b04c2-kube-api-access-tb4j7\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.125136 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl5n5\" (UniqueName: \"kubernetes.io/projected/d94eaa88-cfd0-497d-804d-922ebd316b33-kube-api-access-zl5n5\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: W0310 09:07:14.138262 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a7ee07_f81d_4e5a_aeea_b399aa39a31c.slice/crio-5fe2acbfc74cb4e318cab50fef7688fbdf7157e5096d64dd57c1d234af6fa2f6 WatchSource:0}: Error finding container 5fe2acbfc74cb4e318cab50fef7688fbdf7157e5096d64dd57c1d234af6fa2f6: Status 404 returned error can't find the container with id 5fe2acbfc74cb4e318cab50fef7688fbdf7157e5096d64dd57c1d234af6fa2f6 Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.153528 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed479632-f556-407c-a8a9-b40379bbf549-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.155495 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.168876 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.169113 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.669080053 +0000 UTC m=+220.923977942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169434 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ec510e9-f96b-44da-abec-7d49115d0c83-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169492 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-registration-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169511 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57fad383-2bee-48b1-b513-32a629c976aa-proxy-tls\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169537 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169558 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169582 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169600 
4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-certs\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169618 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zhd\" (UniqueName: \"kubernetes.io/projected/7ec510e9-f96b-44da-abec-7d49115d0c83-kube-api-access-t7zhd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169649 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d8qw\" (UniqueName: \"kubernetes.io/projected/f81e48af-a943-4b68-b259-3c0685529d42-kube-api-access-4d8qw\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169672 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-node-bootstrap-token\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169692 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/828338b4-f6a3-4a38-9596-2556459de30a-serving-cert\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: 
\"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170222 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") pod \"auto-csr-approver-29552226-jp7d9\" (UID: \"632d4971-be4e-4939-a46a-42604b182436\") " pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170254 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc6sv\" (UniqueName: \"kubernetes.io/projected/854dee0a-96a6-41f9-bdbe-d0d820684605-kube-api-access-kc6sv\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170279 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-stats-auth\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170338 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-apiservice-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170365 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7shgc\" (UniqueName: 
\"kubernetes.io/projected/57fad383-2bee-48b1-b513-32a629c976aa-kube-api-access-7shgc\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47s58\" (UniqueName: \"kubernetes.io/projected/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-kube-api-access-47s58\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170416 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmhgt\" (UniqueName: \"kubernetes.io/projected/6fb9cd04-d1cb-446b-9bab-b054c51df85c-kube-api-access-jmhgt\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170438 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-kube-api-access-llg2x\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170456 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fb9cd04-d1cb-446b-9bab-b054c51df85c-proxy-tls\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc 
kubenswrapper[4883]: I0310 09:07:14.170523 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170576 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170598 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-cabundle\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170623 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9409438-97ce-43a6-8a7f-24764925eb53-config-volume\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170639 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/89e1c086-5372-40ce-859d-3eb64bb06012-tmpfs\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170658 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170682 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhbbw\" (UniqueName: \"kubernetes.io/projected/828338b4-f6a3-4a38-9596-2556459de30a-kube-api-access-zhbbw\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170700 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-profile-collector-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170734 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bml5\" (UniqueName: \"kubernetes.io/projected/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-kube-api-access-9bml5\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170750 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-images\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170770 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-mountpoint-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170788 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170809 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhbbx\" (UniqueName: \"kubernetes.io/projected/6e2199dc-f886-4cde-aab8-60f4e4823840-kube-api-access-vhbbx\") pod \"migrator-59844c95c7-sw994\" (UID: \"6e2199dc-f886-4cde-aab8-60f4e4823840\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170827 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170849 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/459d25fc-b392-4a73-bfce-6250fc05c6e4-cert\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170883 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-default-certificate\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170903 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2d7p\" (UniqueName: \"kubernetes.io/projected/e56425c4-e04a-4313-a946-efc4ddac49ee-kube-api-access-h2d7p\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170919 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mnqz\" (UniqueName: \"kubernetes.io/projected/e9409438-97ce-43a6-8a7f-24764925eb53-kube-api-access-7mnqz\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170937 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-srv-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170951 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-245mx\" (UniqueName: \"kubernetes.io/projected/459d25fc-b392-4a73-bfce-6250fc05c6e4-kube-api-access-245mx\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170971 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171003 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f81e48af-a943-4b68-b259-3c0685529d42-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171022 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-srv-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171041 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggmxb\" (UniqueName: \"kubernetes.io/projected/89e1c086-5372-40ce-859d-3eb64bb06012-kube-api-access-ggmxb\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: 
\"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171060 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171078 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbf6q\" (UniqueName: \"kubernetes.io/projected/cf87b69c-5c1e-4297-82c9-ff39bf48b628-kube-api-access-tbf6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171096 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e56425c4-e04a-4313-a946-efc4ddac49ee-service-ca-bundle\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171115 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57fad383-2bee-48b1-b513-32a629c976aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171130 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-csi-data-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171494 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.172182 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-registration-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.172658 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57fad383-2bee-48b1-b513-32a629c976aa-proxy-tls\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.174020 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-cabundle\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.174624 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.174948 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/89e1c086-5372-40ce-859d-3eb64bb06012-tmpfs\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.175248 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9409438-97ce-43a6-8a7f-24764925eb53-config-volume\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.176710 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9409438-97ce-43a6-8a7f-24764925eb53-metrics-tls\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177084 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tksbw\" (UniqueName: \"kubernetes.io/projected/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-kube-api-access-tksbw\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177118 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-socket-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177331 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ftn9\" (UniqueName: \"kubernetes.io/projected/3c05291a-8935-4f5e-81c8-4523b3b7e558-kube-api-access-8ftn9\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177363 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf87b69c-5c1e-4297-82c9-ff39bf48b628-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177390 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828338b4-f6a3-4a38-9596-2556459de30a-config\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177411 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf87b69c-5c1e-4297-82c9-ff39bf48b628-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177410 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-socket-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177428 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-webhook-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177600 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-mountpoint-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.178607 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-key\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.179180 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-csi-data-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " 
pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.179343 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.179488 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-plugins-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.179899 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.679882286 +0000 UTC m=+220.934780176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.180324 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-plugins-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.180415 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-images\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.180538 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-metrics-certs\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.180743 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fb9cd04-d1cb-446b-9bab-b054c51df85c-proxy-tls\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.187253 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f81e48af-a943-4b68-b259-3c0685529d42-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.187976 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9409438-97ce-43a6-8a7f-24764925eb53-metrics-tls\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.187978 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.189153 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-profile-collector-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.191016 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-node-bootstrap-token\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 
09:07:14.191552 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-certs\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.191793 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.192207 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57fad383-2bee-48b1-b513-32a629c976aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.193232 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-apiservice-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.195331 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-webhook-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.195729 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.196093 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/828338b4-f6a3-4a38-9596-2556459de30a-serving-cert\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.196118 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ec510e9-f96b-44da-abec-7d49115d0c83-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.196343 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828338b4-f6a3-4a38-9596-2556459de30a-config\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.196825 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cf87b69c-5c1e-4297-82c9-ff39bf48b628-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.197168 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.199345 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf87b69c-5c1e-4297-82c9-ff39bf48b628-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.201043 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.201094 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-stats-auth\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc 
kubenswrapper[4883]: I0310 09:07:14.201141 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-srv-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.201422 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/459d25fc-b392-4a73-bfce-6250fc05c6e4-cert\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.201835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-default-certificate\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.202422 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e56425c4-e04a-4313-a946-efc4ddac49ee-service-ca-bundle\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.210269 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-srv-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 
09:07:14.211149 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-key\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.211816 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.212355 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-metrics-certs\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.217513 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.218126 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.218408 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.230246 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.231855 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.268853 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7shgc\" (UniqueName: \"kubernetes.io/projected/57fad383-2bee-48b1-b513-32a629c976aa-kube-api-access-7shgc\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.281798 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.282251 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.782237867 +0000 UTC m=+221.037135757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.290518 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zhd\" (UniqueName: \"kubernetes.io/projected/7ec510e9-f96b-44da-abec-7d49115d0c83-kube-api-access-t7zhd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.292943 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7clc9"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.304259 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.317529 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") pod \"auto-csr-approver-29552226-jp7d9\" (UID: \"632d4971-be4e-4939-a46a-42604b182436\") " pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.323827 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.329768 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc6sv\" (UniqueName: \"kubernetes.io/projected/854dee0a-96a6-41f9-bdbe-d0d820684605-kube-api-access-kc6sv\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.340811 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.348934 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d8qw\" (UniqueName: \"kubernetes.io/projected/f81e48af-a943-4b68-b259-3c0685529d42-kube-api-access-4d8qw\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.364854 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4sznb"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.367551 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhbbx\" (UniqueName: \"kubernetes.io/projected/6e2199dc-f886-4cde-aab8-60f4e4823840-kube-api-access-vhbbx\") pod \"migrator-59844c95c7-sw994\" (UID: \"6e2199dc-f886-4cde-aab8-60f4e4823840\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.373009 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.377178 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.385834 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.386624 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.886604674 +0000 UTC m=+221.141502563 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.392151 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47s58\" (UniqueName: \"kubernetes.io/projected/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-kube-api-access-47s58\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.409722 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmhgt\" (UniqueName: \"kubernetes.io/projected/6fb9cd04-d1cb-446b-9bab-b054c51df85c-kube-api-access-jmhgt\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.414741 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.434255 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-kube-api-access-llg2x\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.435619 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.453017 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhbbw\" (UniqueName: \"kubernetes.io/projected/828338b4-f6a3-4a38-9596-2556459de30a-kube-api-access-zhbbw\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.464710 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.466851 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbf6q\" (UniqueName: \"kubernetes.io/projected/cf87b69c-5c1e-4297-82c9-ff39bf48b628-kube-api-access-tbf6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.477837 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h5tmh"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.486891 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.487118 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.987079354 +0000 UTC m=+221.241977243 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.487181 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.487709 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.987701003 +0000 UTC m=+221.242598893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.506391 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.507275 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ppkhj"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.509286 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.513414 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.516323 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.519414 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.523831 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.526161 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bml5\" (UniqueName: \"kubernetes.io/projected/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-kube-api-access-9bml5\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.526456 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.540610 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-29pxk"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.547280 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2d7p\" (UniqueName: \"kubernetes.io/projected/e56425c4-e04a-4313-a946-efc4ddac49ee-kube-api-access-h2d7p\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.556667 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.570836 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mnqz\" (UniqueName: \"kubernetes.io/projected/e9409438-97ce-43a6-8a7f-24764925eb53-kube-api-access-7mnqz\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.580692 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.585680 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggmxb\" (UniqueName: \"kubernetes.io/projected/89e1c086-5372-40ce-859d-3eb64bb06012-kube-api-access-ggmxb\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.588757 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.589139 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.089127376 +0000 UTC m=+221.344025264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: W0310 09:07:14.602447 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd173309_9e96_468f_a21c_f25c86186744.slice/crio-4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa WatchSource:0}: Error finding container 4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa: Status 404 returned error can't find the container with id 4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.617117 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tksbw\" (UniqueName: \"kubernetes.io/projected/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-kube-api-access-tksbw\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.626324 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ftn9\" (UniqueName: \"kubernetes.io/projected/3c05291a-8935-4f5e-81c8-4523b3b7e558-kube-api-access-8ftn9\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.648633 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-245mx\" (UniqueName: \"kubernetes.io/projected/459d25fc-b392-4a73-bfce-6250fc05c6e4-kube-api-access-245mx\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.653442 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.660208 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.667630 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.679219 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.681403 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.690052 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.690397 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.19038542 +0000 UTC m=+221.445283309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.690747 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.692411 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.692675 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.701657 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.720487 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.749744 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5fqgx"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.752467 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.758638 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.770356 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.793517 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.794007 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.794430 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.294414971 +0000 UTC m=+221.549312860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.858796 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.873180 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.895052 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.895670 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.39564877 +0000 UTC m=+221.650546659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: W0310 09:07:14.945597 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d9ea088_9f19_4839_bfe4_ce54842b04c2.slice/crio-5deff5955f79c845cb7848a5a7bca7e8ca0de9262273a1ce4a777ee7c5090c7f WatchSource:0}: Error finding container 5deff5955f79c845cb7848a5a7bca7e8ca0de9262273a1ce4a777ee7c5090c7f: Status 404 returned error can't find the container with id 5deff5955f79c845cb7848a5a7bca7e8ca0de9262273a1ce4a777ee7c5090c7f Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.966687 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.980371 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" event={"ID":"04125307-b213-4579-8042-92284900796b","Type":"ContainerStarted","Data":"3810422895eae394c04d341d1128429a7cf24177c1a49c9c0349e7e7c5fa4708"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.989862 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" event={"ID":"491a1079-cbfa-470e-b91b-84e323ae0c6d","Type":"ContainerStarted","Data":"06786ae24629774a0dc2a2be764632db9ec9884c67613486c9c06887960c9d07"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.992260 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbvf4" event={"ID":"dd173309-9e96-468f-a21c-f25c86186744","Type":"ContainerStarted","Data":"4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.993357 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" event={"ID":"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001","Type":"ContainerStarted","Data":"04efbd56e5c4592c84ed49edf78939f30a9373f93324a0e375d863fde6c3f610"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.994380 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" event={"ID":"09a0b780-3bf5-4607-9907-33e16ae4f098","Type":"ContainerStarted","Data":"5fc42ea173c194caf3bbe9755d873f9b26e7a45998273ec4e4aaf788c0cba26c"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.994466 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" 
event={"ID":"09a0b780-3bf5-4607-9907-33e16ae4f098","Type":"ContainerStarted","Data":"13023a5482ddf6991a949dcf22f69370474d9371fad5a2db16cd534be65c0f68"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.995610 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.996125 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.496109123 +0000 UTC m=+221.751007012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.996244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" event={"ID":"7ba6ab17-ada9-4712-bc66-09172d648791","Type":"ContainerStarted","Data":"4c975cdc2a429849e34dce8ac1c1d4d5a27e4fa7ddde84fa00f2598f927d599d"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.997439 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" 
event={"ID":"ed479632-f556-407c-a8a9-b40379bbf549","Type":"ContainerStarted","Data":"e42417f0cfbbffbd4669b984cc60fe516dc68b01da288b467b4a7a0cdf8d7c49"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.001412 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" event={"ID":"bed2a913-4f7d-4a64-aed8-a510280c9b6b","Type":"ContainerStarted","Data":"b2fc80474c3305f3d579cf662d18d6f3cc4eeba384f56feddbc6fb18da99545c"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.010390 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" event={"ID":"d94eaa88-cfd0-497d-804d-922ebd316b33","Type":"ContainerStarted","Data":"322158f2c015958bb071cc6d007c875789a46a12002650e251b7a67f4fa5997f"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.014751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" event={"ID":"da524055-8528-423f-9ccd-70198a4fbf99","Type":"ContainerStarted","Data":"c26a515962952b0dca378df9d0df683fab142453ca7aa14be72f83e3e38823fb"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.016465 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" event={"ID":"90f9a6f1-0760-4398-80cf-70c615c7032d","Type":"ContainerStarted","Data":"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.016530 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" event={"ID":"90f9a6f1-0760-4398-80cf-70c615c7032d","Type":"ContainerStarted","Data":"4fcf975f26107b7cfd1ff1be2d34f1e281e19924c7820362af5907d5ba2ac3dc"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.016943 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.019367 4883 generic.go:334] "Generic (PLEG): container finished" podID="4210a360-cb3e-4fa8-8fd1-98217c9b00f2" containerID="dd07948b85920c5ca7cf689b80b0158f2e9b7a1ad1f4fccb474033b2a0795aa1" exitCode=0 Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.019412 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" event={"ID":"4210a360-cb3e-4fa8-8fd1-98217c9b00f2","Type":"ContainerDied","Data":"dd07948b85920c5ca7cf689b80b0158f2e9b7a1ad1f4fccb474033b2a0795aa1"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.053911 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" event={"ID":"93468e41-3e48-469f-90a9-7e05e45fe141","Type":"ContainerStarted","Data":"1c4c09e1af0a2fb9b657472a09c396cd2dc93c4029029e61abddf874ffd1ac54"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.053943 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" event={"ID":"93468e41-3e48-469f-90a9-7e05e45fe141","Type":"ContainerStarted","Data":"923cdd4f50b4442178c8bdca2b5c8bb103c402fab8b69481910326d230d89c50"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.057172 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" event={"ID":"8d9ea088-9f19-4839-bfe4-ce54842b04c2","Type":"ContainerStarted","Data":"5deff5955f79c845cb7848a5a7bca7e8ca0de9262273a1ce4a777ee7c5090c7f"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.068419 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" 
event={"ID":"d6acff1e-cd79-44a7-bb48-1a79857b2a97","Type":"ContainerStarted","Data":"1c4bfc5429a2aa7d8ca472e0f699f74024464d12512f6ea704c074a5215af6ad"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.078448 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" event={"ID":"313c1e2e-103d-4418-ab8d-9c1e1661f3f7","Type":"ContainerStarted","Data":"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.078556 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" event={"ID":"313c1e2e-103d-4418-ab8d-9c1e1661f3f7","Type":"ContainerStarted","Data":"4c062e8bdd69b4e921c05bdb270d650295db96c62cdadbfa314f1f418088417e"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.078955 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.081895 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34602: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.085954 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" event={"ID":"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270","Type":"ContainerStarted","Data":"a72fce3434ade8bf0096008cd0a6271f939804ef40b3f753cb54190e4d8d1a6d"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.086948 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.088330 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" 
event={"ID":"854dee0a-96a6-41f9-bdbe-d0d820684605","Type":"ContainerStarted","Data":"ce7d9665ad34b2597288c408a2e16f9d20c91e90fb0d0709fc942937d2f02355"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.093866 4883 generic.go:334] "Generic (PLEG): container finished" podID="04a7ee07-f81d-4e5a-aeea-b399aa39a31c" containerID="9f50c7f85e628309d8c981873187d7b98ebd10d7e244c1a7a602685cbbbca279" exitCode=0 Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.093909 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" event={"ID":"04a7ee07-f81d-4e5a-aeea-b399aa39a31c","Type":"ContainerDied","Data":"9f50c7f85e628309d8c981873187d7b98ebd10d7e244c1a7a602685cbbbca279"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.093924 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" event={"ID":"04a7ee07-f81d-4e5a-aeea-b399aa39a31c","Type":"ContainerStarted","Data":"5fe2acbfc74cb4e318cab50fef7688fbdf7157e5096d64dd57c1d234af6fa2f6"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.097298 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.098997 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.598982889 +0000 UTC m=+221.853880778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.100726 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" event={"ID":"3de74a75-4aa1-46dd-ae5b-5c82b91811e5","Type":"ContainerStarted","Data":"73a26b0f18d24208d01caab586192f525c163e6edc9825ceaa5693d09d151dda"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.100820 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" event={"ID":"3de74a75-4aa1-46dd-ae5b-5c82b91811e5","Type":"ContainerStarted","Data":"c2d528bc35131fc5a84d816a34ae1e57e68d7a2d6cf1870754994066c9c81a85"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.147298 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" event={"ID":"04118e18-43d2-4aed-9812-aba776c0bf61","Type":"ContainerStarted","Data":"6d967f23f52a9fe345bd815805ce396d68c4d0c16172123b280d34c14caadfc7"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.147348 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" event={"ID":"04118e18-43d2-4aed-9812-aba776c0bf61","Type":"ContainerStarted","Data":"c71f34b23d0938af07832d6026cab9c92192f4492913028b07ad8294e3ab5ff9"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.148178 4883 patch_prober.go:28] interesting pod/downloads-7954f5f757-69msk container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.148216 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-69msk" podUID="ca36a0b9-d7c9-4195-803b-53d41ac683d9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.175918 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.198848 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34616: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.204953 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.210923 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.710903333 +0000 UTC m=+221.965801223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.267330 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.288875 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34624: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.289226 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.311206 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.313592 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.813578265 +0000 UTC m=+222.068476155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.319938 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.321690 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kzsbn"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.392614 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34632: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.419108 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.419229 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.919203912 +0000 UTC m=+222.174101801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.419454 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.419780 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.91977116 +0000 UTC m=+222.174669038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.451648 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:07:15 crc kubenswrapper[4883]: W0310 09:07:15.459642 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81e48af_a943_4b68_b259_3c0685529d42.slice/crio-333212c33e5dd63b5421066ce6179e8df37301ffb9110ba9c5b0e9c5330a01b9 WatchSource:0}: Error finding container 333212c33e5dd63b5421066ce6179e8df37301ffb9110ba9c5b0e9c5330a01b9: Status 404 returned error can't find the container with id 333212c33e5dd63b5421066ce6179e8df37301ffb9110ba9c5b0e9c5330a01b9 Mar 10 09:07:15 crc kubenswrapper[4883]: W0310 09:07:15.489992 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec510e9_f96b_44da_abec_7d49115d0c83.slice/crio-53b2048ec91d16049f11459ce8cc579f19d97009bd12bb6aa206cfde30f5952b WatchSource:0}: Error finding container 53b2048ec91d16049f11459ce8cc579f19d97009bd12bb6aa206cfde30f5952b: Status 404 returned error can't find the container with id 53b2048ec91d16049f11459ce8cc579f19d97009bd12bb6aa206cfde30f5952b Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.496938 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34648: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.510688 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.521672 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.522211 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.022194657 +0000 UTC m=+222.277092547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.556090 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dh2nm"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.607092 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" podStartSLOduration=193.607069043 podStartE2EDuration="3m13.607069043s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
09:07:15.559810043 +0000 UTC m=+221.814707933" watchObservedRunningTime="2026-03-10 09:07:15.607069043 +0000 UTC m=+221.861966932" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.610040 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5nj7x"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.612518 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34652: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.615297 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.623384 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.623732 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.123716499 +0000 UTC m=+222.378614377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.661894 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" podStartSLOduration=193.661872131 podStartE2EDuration="3m13.661872131s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:15.653122712 +0000 UTC m=+221.908020601" watchObservedRunningTime="2026-03-10 09:07:15.661872131 +0000 UTC m=+221.916770021" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.724783 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.725887 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.225755584 +0000 UTC m=+222.480653483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.726382 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.726705 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.226694501 +0000 UTC m=+222.481592390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: W0310 09:07:15.739061 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb98199_ce3b_4a19_bc11_a4c55d8e8df2.slice/crio-e4984f5d6447b59945f2f4032d8994328bdcb77ec70f7983a812672eca1f58f3 WatchSource:0}: Error finding container e4984f5d6447b59945f2f4032d8994328bdcb77ec70f7983a812672eca1f58f3: Status 404 returned error can't find the container with id e4984f5d6447b59945f2f4032d8994328bdcb77ec70f7983a812672eca1f58f3 Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.825907 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34664: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.828982 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.829404 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.329386375 +0000 UTC m=+222.584284264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.877140 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-69msk" podStartSLOduration=193.877125647 podStartE2EDuration="3m13.877125647s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:15.876142768 +0000 UTC m=+222.131040656" watchObservedRunningTime="2026-03-10 09:07:15.877125647 +0000 UTC m=+222.132023536"
Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.931251 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.932362 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.432346093 +0000 UTC m=+222.687243982 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.966034 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" podStartSLOduration=193.966011242 podStartE2EDuration="3m13.966011242s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:15.948390714 +0000 UTC m=+222.203288603" watchObservedRunningTime="2026-03-10 09:07:15.966011242 +0000 UTC m=+222.220909132"
Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.969211 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2"]
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.014328 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.038350 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.038880 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.538864057 +0000 UTC m=+222.793761947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.068636 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-42rrg" podStartSLOduration=194.068614489 podStartE2EDuration="3m14.068614489s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.028794654 +0000 UTC m=+222.283692543" watchObservedRunningTime="2026-03-10 09:07:16.068614489 +0000 UTC m=+222.323512378"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.070534 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4"]
Mar 10 09:07:16 crc kubenswrapper[4883]: W0310 09:07:16.085589 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod828338b4_f6a3_4a38_9596_2556459de30a.slice/crio-86718298f8d5d1fae8fd62a6bc92aeb4ee6024d977db73ae2f5f882199fca13d WatchSource:0}: Error finding container 86718298f8d5d1fae8fd62a6bc92aeb4ee6024d977db73ae2f5f882199fca13d: Status 404 returned error can't find the container with id 86718298f8d5d1fae8fd62a6bc92aeb4ee6024d977db73ae2f5f882199fca13d
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.118348 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"]
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.144611 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.145011 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.645000285 +0000 UTC m=+222.899898174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.206577 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbvf4" event={"ID":"dd173309-9e96-468f-a21c-f25c86186744","Type":"ContainerStarted","Data":"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.221662 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"61baee90e8b438025e934296e5a650a7588c153da10769e3db60481de3622944"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.224877 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x6pxw"]
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.246126 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.246584 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.746570336 +0000 UTC m=+223.001468226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.267261 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34676: no serving certificate available for the kubelet"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.267833 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" event={"ID":"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001","Type":"ContainerStarted","Data":"91aa8a80f7a942bed2a8a74ac537a8962d5fb40a52ca30f835d0a4d9536c78d8"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.300371 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" event={"ID":"7ba6ab17-ada9-4712-bc66-09172d648791","Type":"ContainerStarted","Data":"a1365eacac9555d17db63c4bb03627cf9490e95ce606ef4451640160334e1ed5"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.318865 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"]
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.326860 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" event={"ID":"828338b4-f6a3-4a38-9596-2556459de30a","Type":"ContainerStarted","Data":"86718298f8d5d1fae8fd62a6bc92aeb4ee6024d977db73ae2f5f882199fca13d"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.333992 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" event={"ID":"57fad383-2bee-48b1-b513-32a629c976aa","Type":"ContainerStarted","Data":"fe5b2595f3d0c0bff90d35e233f13461651357c92a710f6d2c67899a77075aae"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.334034 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" event={"ID":"57fad383-2bee-48b1-b513-32a629c976aa","Type":"ContainerStarted","Data":"7b8babb49c8f6284579884f142ce0d9eb11e924b3d290da9f46fb58258ccc552"}
Mar 10 09:07:16 crc kubenswrapper[4883]: W0310 09:07:16.339609 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9409438_97ce_43a6_8a7f_24764925eb53.slice/crio-32009b107107518d961d11d18e34817a517e07d8c43071edaacf7cfe686e5fa6 WatchSource:0}: Error finding container 32009b107107518d961d11d18e34817a517e07d8c43071edaacf7cfe686e5fa6: Status 404 returned error can't find the container with id 32009b107107518d961d11d18e34817a517e07d8c43071edaacf7cfe686e5fa6
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.348021 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.350104 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.850093054 +0000 UTC m=+223.104990943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.370683 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" event={"ID":"f81e48af-a943-4b68-b259-3c0685529d42","Type":"ContainerStarted","Data":"333212c33e5dd63b5421066ce6179e8df37301ffb9110ba9c5b0e9c5330a01b9"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.386918 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" event={"ID":"632d4971-be4e-4939-a46a-42604b182436","Type":"ContainerStarted","Data":"de61b8fd98c13cb0710e4560c66bd5b5056787cf78c03765e45a9dc01a3d0bf9"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.409783 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9vv9k" event={"ID":"e56425c4-e04a-4313-a946-efc4ddac49ee","Type":"ContainerStarted","Data":"8f169b8b6f5fb205e9aa9844776c9cb7bb1c7eec9edcee3d35102d75c42337fa"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.412430 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99"]
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.421034 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"]
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.438432 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" event={"ID":"09a0b780-3bf5-4607-9907-33e16ae4f098","Type":"ContainerStarted","Data":"a0f3b7efe8911bf9d9034a64956d059c8fc9413cd2a119b82665786612bca0d7"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.449446 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.450612 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.950598883 +0000 UTC m=+223.205496773 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.450762 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.451793 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.951785727 +0000 UTC m=+223.206683616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.485723 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" event={"ID":"bed2a913-4f7d-4a64-aed8-a510280c9b6b","Type":"ContainerStarted","Data":"d814627899b16948e8f7553b870f65a92008d7092c8bab34fb02d5a7b27d6dd3"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.497238 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"]
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.515562 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bzfz7" event={"ID":"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb","Type":"ContainerStarted","Data":"f7c83fe73e51ea54e398d880432b376da1707bc93dd19c084d22ee954413ba08"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.517958 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" event={"ID":"6e2199dc-f886-4cde-aab8-60f4e4823840","Type":"ContainerStarted","Data":"d5461303fb0081511ff9d46b0acfb4e13f7e46d3169f64d643fba42cfa571529"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.531435 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" podStartSLOduration=194.531419487 podStartE2EDuration="3m14.531419487s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.509146802 +0000 UTC m=+222.764044691" watchObservedRunningTime="2026-03-10 09:07:16.531419487 +0000 UTC m=+222.786317377"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.533076 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" event={"ID":"04125307-b213-4579-8042-92284900796b","Type":"ContainerStarted","Data":"4c5f20570e22466208fc0ff8da3ccc018d409605de288a6887be0d64a2a3364f"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.534301 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" event={"ID":"04125307-b213-4579-8042-92284900796b","Type":"ContainerStarted","Data":"d26b7504a5b7a8938a38786ee5deaca63c220462d0424a2067493e687ecf6846"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.534409 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" event={"ID":"491a1079-cbfa-470e-b91b-84e323ae0c6d","Type":"ContainerStarted","Data":"d723cb7d45e224806bacb387153460074d6ee328c83925e0765b2da14178fc7f"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.547605 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" event={"ID":"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2","Type":"ContainerStarted","Data":"e4984f5d6447b59945f2f4032d8994328bdcb77ec70f7983a812672eca1f58f3"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.551623 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.552592 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.052578266 +0000 UTC m=+223.307476154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.564171 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" event={"ID":"854dee0a-96a6-41f9-bdbe-d0d820684605","Type":"ContainerStarted","Data":"7ea2d5c378d4fff852d4edd0fe20824276dc100d9936799346d55f54776bf476"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.623316 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5nj7x" event={"ID":"459d25fc-b392-4a73-bfce-6250fc05c6e4","Type":"ContainerStarted","Data":"ccf6dd7a024cec4914a4d8426a08e5a3d685a525ec697953b9a21ccfe71f8a88"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.645666 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" event={"ID":"da524055-8528-423f-9ccd-70198a4fbf99","Type":"ContainerStarted","Data":"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.647550 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.655720 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.658310 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.158297067 +0000 UTC m=+223.413194955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.668306 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" event={"ID":"cf87b69c-5c1e-4297-82c9-ff39bf48b628","Type":"ContainerStarted","Data":"1781318da7f0e95081973e6cbcd4b2f78521117f06ab141da6e3a67f40c29484"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.669498 4883 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-76t2f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.669533 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.702557 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.722503 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" podStartSLOduration=194.722468321 podStartE2EDuration="3m14.722468321s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.721595717 +0000 UTC m=+222.976493606" watchObservedRunningTime="2026-03-10 09:07:16.722468321 +0000 UTC m=+222.977366210"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.745434 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" event={"ID":"3de74a75-4aa1-46dd-ae5b-5c82b91811e5","Type":"ContainerStarted","Data":"641f35eaaa1fca530ef6bb4774bf889868ee76f008b7fba79c8a3b9564104bcb"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.757290 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.760348 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.260323718 +0000 UTC m=+223.515221606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.765212 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" event={"ID":"7ec510e9-f96b-44da-abec-7d49115d0c83","Type":"ContainerStarted","Data":"53b2048ec91d16049f11459ce8cc579f19d97009bd12bb6aa206cfde30f5952b"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.777391 4883 generic.go:334] "Generic (PLEG): container finished" podID="d94eaa88-cfd0-497d-804d-922ebd316b33" containerID="e9bc417da75478fa9167ebb9aee182ee2c4e5f0223fffec129e37b685bde91c7" exitCode=0
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.777903 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" event={"ID":"d94eaa88-cfd0-497d-804d-922ebd316b33","Type":"ContainerDied","Data":"e9bc417da75478fa9167ebb9aee182ee2c4e5f0223fffec129e37b685bde91c7"}
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.784623 4883 patch_prober.go:28] interesting pod/downloads-7954f5f757-69msk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.784689 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-69msk" podUID="ca36a0b9-d7c9-4195-803b-53d41ac683d9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.817691 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nbvf4" podStartSLOduration=194.817666949 podStartE2EDuration="3m14.817666949s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.78496941 +0000 UTC m=+223.039867300" watchObservedRunningTime="2026-03-10 09:07:16.817666949 +0000 UTC m=+223.072564838"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.835046 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" podStartSLOduration=194.835010846 podStartE2EDuration="3m14.835010846s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.83061419 +0000 UTC m=+223.085512079" watchObservedRunningTime="2026-03-10 09:07:16.835010846 +0000 UTC m=+223.089908735"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.868844 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.873581 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.373549741 +0000 UTC m=+223.628447630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.882138 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" podStartSLOduration=194.882120955 podStartE2EDuration="3m14.882120955s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.87499444 +0000 UTC m=+223.129892329" watchObservedRunningTime="2026-03-10 09:07:16.882120955 +0000 UTC m=+223.137018844"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.921555 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" podStartSLOduration=194.921530738 podStartE2EDuration="3m14.921530738s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.91978405 +0000 UTC m=+223.174681929" watchObservedRunningTime="2026-03-10 09:07:16.921530738 +0000 UTC m=+223.176428627"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.962641 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" podStartSLOduration=194.962621486 podStartE2EDuration="3m14.962621486s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.961142212 +0000 UTC m=+223.216040101" watchObservedRunningTime="2026-03-10 09:07:16.962621486 +0000 UTC m=+223.217519376"
Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.978721 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.979241 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.479227725 +0000 UTC m=+223.734125614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.000604 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34682: no serving certificate available for the kubelet"
Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.092752 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.093285 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.593273961 +0000 UTC m=+223.848171851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.099604 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" podStartSLOduration=195.099587015 podStartE2EDuration="3m15.099587015s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.00270632 +0000 UTC m=+223.257604209" watchObservedRunningTime="2026-03-10 09:07:17.099587015 +0000 UTC m=+223.354484904"
Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.100144 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" podStartSLOduration=195.100139345 podStartE2EDuration="3m15.100139345s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.081396523 +0000 UTC m=+223.336294413" watchObservedRunningTime="2026-03-10 09:07:17.100139345 +0000 UTC m=+223.355037234"
Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.167490 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-bzfz7" podStartSLOduration=6.167448545 podStartE2EDuration="6.167448545s" podCreationTimestamp="2026-03-10 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.153127818 +0000 UTC m=+223.408025708" watchObservedRunningTime="2026-03-10 09:07:17.167448545 +0000 UTC m=+223.422346424"
Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.195272 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.195737 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.695721885 +0000 UTC m=+223.950619774 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.261960 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" podStartSLOduration=195.261942517 podStartE2EDuration="3m15.261942517s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.254161971 +0000 UTC m=+223.509059861" watchObservedRunningTime="2026-03-10 09:07:17.261942517 +0000 UTC m=+223.516840406" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.263110 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" podStartSLOduration=195.263104003 podStartE2EDuration="3m15.263104003s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.213104337 +0000 UTC m=+223.468002225" watchObservedRunningTime="2026-03-10 09:07:17.263104003 +0000 UTC m=+223.518001892" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.298241 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.298588 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.798574142 +0000 UTC m=+224.053472021 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.307461 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" podStartSLOduration=195.307442485 podStartE2EDuration="3m15.307442485s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.30431638 +0000 UTC m=+223.559214269" watchObservedRunningTime="2026-03-10 09:07:17.307442485 +0000 UTC m=+223.562340375" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.400523 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:17 crc kubenswrapper[4883]: 
E0310 09:07:17.401083 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.901061771 +0000 UTC m=+224.155959660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.401733 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" podStartSLOduration=195.401715671 podStartE2EDuration="3m15.401715671s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.399790878 +0000 UTC m=+223.654688768" watchObservedRunningTime="2026-03-10 09:07:17.401715671 +0000 UTC m=+223.656613560" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.401844 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" podStartSLOduration=195.401838503 podStartE2EDuration="3m15.401838503s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.370722561 +0000 UTC m=+223.625620450" watchObservedRunningTime="2026-03-10 09:07:17.401838503 
+0000 UTC m=+223.656736392" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.448940 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.448988 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.504413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.504753 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.004740152 +0000 UTC m=+224.259638041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.606058 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.606776 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.106761262 +0000 UTC m=+224.361659150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.609349 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.675028 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" podStartSLOduration=195.675010712 podStartE2EDuration="3m15.675010712s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.485075214 +0000 UTC m=+223.739973103" watchObservedRunningTime="2026-03-10 09:07:17.675010712 +0000 UTC m=+223.929908602" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.710596 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.711950 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 09:07:18.211929256 +0000 UTC m=+224.466827145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.814798 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.815524 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.315505836 +0000 UTC m=+224.570403724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.848693 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" event={"ID":"89e1c086-5372-40ce-859d-3eb64bb06012","Type":"ContainerStarted","Data":"de640c1d204a6a6bc9d8006403f61dc265c51a63476abfb40f9929189e3047c5"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.848748 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" event={"ID":"89e1c086-5372-40ce-859d-3eb64bb06012","Type":"ContainerStarted","Data":"bc1a39c52b64b8e4030fb860cfe5d517b23ce28f670bc80cdab587d0fb8973e0"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.849305 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.865335 4883 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7thqp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.865390 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" podUID="89e1c086-5372-40ce-859d-3eb64bb06012" containerName="packageserver" probeResult="failure" 
output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.876717 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" podStartSLOduration=195.876703892 podStartE2EDuration="3m15.876703892s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.874812282 +0000 UTC m=+224.129710171" watchObservedRunningTime="2026-03-10 09:07:17.876703892 +0000 UTC m=+224.131601782" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.909913 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" event={"ID":"7ec510e9-f96b-44da-abec-7d49115d0c83","Type":"ContainerStarted","Data":"b05595c1bee7c0fb7806f0fe9b4fdda6953eaeb7ccc98597a71597db47aa9201"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.919375 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.919768 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.419746494 +0000 UTC m=+224.674644382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.955089 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"149154dacd692520edab53ab1b6b290faad89add480249f478282a1ccb53512f"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.976504 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" event={"ID":"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9","Type":"ContainerStarted","Data":"0ba4db3001272752a5c0f3ef5737b4e8677404a186b994685a1217279d191778"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.976562 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" event={"ID":"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9","Type":"ContainerStarted","Data":"12a67716dcb0f3e0591bf818028fe9112722ac345728f4332cd4beae4bdc154b"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.977642 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.995611 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.997170 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" event={"ID":"6e2199dc-f886-4cde-aab8-60f4e4823840","Type":"ContainerStarted","Data":"5844bc14d9bbcd35e0aab47f240d022209dee44b2d46584f185ba336ec373815"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.997197 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" event={"ID":"6e2199dc-f886-4cde-aab8-60f4e4823840","Type":"ContainerStarted","Data":"c65052a843165285629e1d749e51ea6da22f549acfbdc6d218a5d058b7887eed"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.009592 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" podStartSLOduration=196.009571147 podStartE2EDuration="3m16.009571147s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.008800756 +0000 UTC m=+224.263698646" watchObservedRunningTime="2026-03-10 09:07:18.009571147 +0000 UTC m=+224.264469036" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.014136 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" event={"ID":"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2","Type":"ContainerStarted","Data":"cd73b445e62902774b57ff3e5a972ad3b51cb219983678fbd1cf0106b444b75e"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.021878 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.022878 4883 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.52286444 +0000 UTC m=+224.777762329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.051772 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" event={"ID":"57fad383-2bee-48b1-b513-32a629c976aa","Type":"ContainerStarted","Data":"ed668a885ff69442249f69c3eacf8edf8c8d73846327f5d249e38422d1a1839c"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.070465 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" podStartSLOduration=196.07045382 podStartE2EDuration="3m16.07045382s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.069903515 +0000 UTC m=+224.324801404" watchObservedRunningTime="2026-03-10 09:07:18.07045382 +0000 UTC m=+224.325351709" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.127803 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.128949 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.628932428 +0000 UTC m=+224.883830318 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.135529 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" event={"ID":"491a1079-cbfa-470e-b91b-84e323ae0c6d","Type":"ContainerStarted","Data":"0cf7088425cc1ceed196102e080795a89f935be52bb32bd59cc200e8a0be03aa"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.158993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" event={"ID":"4210a360-cb3e-4fa8-8fd1-98217c9b00f2","Type":"ContainerStarted","Data":"c2165751bc44a713fdd66d7612b6042159fdbc505f8596dd35eb68f80753c177"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.164141 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" podStartSLOduration=196.164129152 podStartE2EDuration="3m16.164129152s" 
podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.162714679 +0000 UTC m=+224.417612568" watchObservedRunningTime="2026-03-10 09:07:18.164129152 +0000 UTC m=+224.419027040" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.164768 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" podStartSLOduration=196.164762744 podStartE2EDuration="3m16.164762744s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.103530021 +0000 UTC m=+224.358427901" watchObservedRunningTime="2026-03-10 09:07:18.164762744 +0000 UTC m=+224.419660633" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.174199 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" event={"ID":"3c05291a-8935-4f5e-81c8-4523b3b7e558","Type":"ContainerStarted","Data":"22dbd6a924198c07ada91ae2264df7ef353cbd906df2bb998720c1f7f679d6e3"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.174230 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" event={"ID":"3c05291a-8935-4f5e-81c8-4523b3b7e558","Type":"ContainerStarted","Data":"a1e81e20922b5a051ba9d09c2d31d2a47efd773a0bad99d72604af38b8e61429"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.174929 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.195884 4883 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mg7tt container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.195952 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" podUID="3c05291a-8935-4f5e-81c8-4523b3b7e558" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.202692 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" podStartSLOduration=196.202674808 podStartE2EDuration="3m16.202674808s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.200574054 +0000 UTC m=+224.455471933" watchObservedRunningTime="2026-03-10 09:07:18.202674808 +0000 UTC m=+224.457572697" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.204210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" event={"ID":"6fb9cd04-d1cb-446b-9bab-b054c51df85c","Type":"ContainerStarted","Data":"ef5d210ef7d367a4fe800288d31a24b1360c72c641186b395417e2487c44a950"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.204249 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" event={"ID":"6fb9cd04-d1cb-446b-9bab-b054c51df85c","Type":"ContainerStarted","Data":"d0fd1f1ab125bfbee59b8d2d35bde536df9636926ac723fb91c8abad3ee4e6d6"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.214674 4883 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" event={"ID":"0be14f8e-b9d8-4058-9be3-cdc61ce88626","Type":"ContainerStarted","Data":"d15aaac8a80d44b120c58eb49e71ed5d06a04daa936e43f5e09fe3fbce5d0142"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.214706 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" event={"ID":"0be14f8e-b9d8-4058-9be3-cdc61ce88626","Type":"ContainerStarted","Data":"b2de766c7f3e1df780306fb20dfcae02ace3d5080579f4d7e5fa2a3d5480fbbb"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.231066 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.232086 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.732070061 +0000 UTC m=+224.986967950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.234543 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" event={"ID":"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270","Type":"ContainerStarted","Data":"f072ddcc741d44b14f25e0e23062c848ba9921be414e2d336c03c8991d7ca771"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.249784 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerStarted","Data":"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.249817 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerStarted","Data":"ee1e42ffe97556105d0510c897a1238a2dd105fd96a60722e66b11e2fc0634b8"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.250785 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.254661 4883 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ndt59 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: 
connection refused" start-of-body= Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.254690 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.255422 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" podStartSLOduration=196.255409484 podStartE2EDuration="3m16.255409484s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.233669672 +0000 UTC m=+224.488567561" watchObservedRunningTime="2026-03-10 09:07:18.255409484 +0000 UTC m=+224.510307374" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.257068 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" podStartSLOduration=196.257061103 podStartE2EDuration="3m16.257061103s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.255825838 +0000 UTC m=+224.510723726" watchObservedRunningTime="2026-03-10 09:07:18.257061103 +0000 UTC m=+224.511958992" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.267390 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bzfz7" event={"ID":"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb","Type":"ContainerStarted","Data":"1681d12f7f409c28c6346728a6f81e7b5575f9cb9073cc8bfd674a9c57e11468"} Mar 10 09:07:18 crc 
kubenswrapper[4883]: I0310 09:07:18.277949 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" event={"ID":"ed479632-f556-407c-a8a9-b40379bbf549","Type":"ContainerStarted","Data":"347b506ea4fac471db22f0d72d48d1bd6bb4df665e8acc357b7f7ebec7fc7c86"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.278520 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" podStartSLOduration=196.27850586 podStartE2EDuration="3m16.27850586s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.277402203 +0000 UTC m=+224.532300092" watchObservedRunningTime="2026-03-10 09:07:18.27850586 +0000 UTC m=+224.533403749" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.299151 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5nj7x" event={"ID":"459d25fc-b392-4a73-bfce-6250fc05c6e4","Type":"ContainerStarted","Data":"03e9fdec684fe3ef272fd36f8035817f2575a4cceab0125ea9ee75cab1747985"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.312959 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" podStartSLOduration=196.312944727 podStartE2EDuration="3m16.312944727s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.308862052 +0000 UTC m=+224.563759931" watchObservedRunningTime="2026-03-10 09:07:18.312944727 +0000 UTC m=+224.567842616" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.333350 4883 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" podStartSLOduration=196.333327104 podStartE2EDuration="3m16.333327104s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.328734479 +0000 UTC m=+224.583632367" watchObservedRunningTime="2026-03-10 09:07:18.333327104 +0000 UTC m=+224.588224993" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.335198 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.337975 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.837962549 +0000 UTC m=+225.092860438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.359795 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34690: no serving certificate available for the kubelet" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.366709 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x6pxw" event={"ID":"e9409438-97ce-43a6-8a7f-24764925eb53","Type":"ContainerStarted","Data":"ea4c74a55ed009b35dd59b065aadde0e8ff5437953fa2f9b2f59287769940fc5"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.366748 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x6pxw" event={"ID":"e9409438-97ce-43a6-8a7f-24764925eb53","Type":"ContainerStarted","Data":"32009b107107518d961d11d18e34817a517e07d8c43071edaacf7cfe686e5fa6"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.366898 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.367576 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5nj7x" podStartSLOduration=7.367556316 podStartE2EDuration="7.367556316s" podCreationTimestamp="2026-03-10 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.366812465 +0000 UTC m=+224.621710355" watchObservedRunningTime="2026-03-10 09:07:18.367556316 +0000 UTC m=+224.622454205" Mar 
10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.398827 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" event={"ID":"04a7ee07-f81d-4e5a-aeea-b399aa39a31c","Type":"ContainerStarted","Data":"b4a95b1abde7e971c4eb667cf87a07279390514097e272219e2b2b3eed4701c6"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.428192 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9vv9k" event={"ID":"e56425c4-e04a-4313-a946-efc4ddac49ee","Type":"ContainerStarted","Data":"6d3206c5973d2478a28e08951118b229a80aeb90c9cb7aa645bf8556a70cf664"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.436558 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.438127 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.938090366 +0000 UTC m=+225.192988245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.452910 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" event={"ID":"cf87b69c-5c1e-4297-82c9-ff39bf48b628","Type":"ContainerStarted","Data":"52c146b14da7df48ffeb2a6dc9c82da3b3a44e3504c5af70d26d827396663483"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.461937 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9vv9k" podStartSLOduration=196.461924321 podStartE2EDuration="3m16.461924321s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.459969901 +0000 UTC m=+224.714867790" watchObservedRunningTime="2026-03-10 09:07:18.461924321 +0000 UTC m=+224.716822210" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.464706 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x6pxw" podStartSLOduration=7.464697531 podStartE2EDuration="7.464697531s" podCreationTimestamp="2026-03-10 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.399712685 +0000 UTC m=+224.654610574" watchObservedRunningTime="2026-03-10 09:07:18.464697531 +0000 UTC m=+224.719595420" Mar 10 09:07:18 crc 
kubenswrapper[4883]: I0310 09:07:18.465749 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" event={"ID":"8d9ea088-9f19-4839-bfe4-ce54842b04c2","Type":"ContainerStarted","Data":"88fd3b28cd926a2269d5b549a9fe268da29b27e7e2d4ef1b074ba4370fdae271"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.473159 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" event={"ID":"828338b4-f6a3-4a38-9596-2556459de30a","Type":"ContainerStarted","Data":"515edad77b98ea6e3d578a7af2269583e805edc35026dddce5be92f595ab31c1"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.495354 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" podStartSLOduration=196.495339479 podStartE2EDuration="3m16.495339479s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.494065992 +0000 UTC m=+224.748963881" watchObservedRunningTime="2026-03-10 09:07:18.495339479 +0000 UTC m=+224.750237369" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.507970 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" event={"ID":"f81e48af-a943-4b68-b259-3c0685529d42","Type":"ContainerStarted","Data":"36b573f303e575f73cd357ced0e8a3184fddb367b14d95484d33c759e93cfcb4"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.508013 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" 
event={"ID":"f81e48af-a943-4b68-b259-3c0685529d42","Type":"ContainerStarted","Data":"7f9cf91370229057e7414a59d04c86c15b77698e4b193caf4906693923179325"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.508027 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.518788 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.543384 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.550667 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.050651277 +0000 UTC m=+225.305549166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.604378 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" podStartSLOduration=196.604351921 podStartE2EDuration="3m16.604351921s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.53366938 +0000 UTC m=+224.788567270" watchObservedRunningTime="2026-03-10 09:07:18.604351921 +0000 UTC m=+224.859249980" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.646569 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.654541 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.655041 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.656599 4883 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.156577899 +0000 UTC m=+225.411475788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.672566 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.674314 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.681801 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:18 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:18 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:18 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.681856 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 
09:07:18.706128 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" podStartSLOduration=196.706105229 podStartE2EDuration="3m16.706105229s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.628749596 +0000 UTC m=+224.883647485" watchObservedRunningTime="2026-03-10 09:07:18.706105229 +0000 UTC m=+224.961003117" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.757630 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.758065 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.258048013 +0000 UTC m=+225.512945902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.858583 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.858820 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.358791709 +0000 UTC m=+225.613689598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.858860 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.859375 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.35936031 +0000 UTC m=+225.614258199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.860603 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"] Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.914881 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.960488 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.960669 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.460644623 +0000 UTC m=+225.715542512 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.960867 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.961281 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.461271584 +0000 UTC m=+225.716169473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.061683 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:19 crc kubenswrapper[4883]: E0310 09:07:19.062004 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.561990774 +0000 UTC m=+225.816888662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.163124 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:19 crc kubenswrapper[4883]: E0310 09:07:19.163520 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.663498909 +0000 UTC m=+225.918396788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.189850 4883 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.264319 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:19 crc kubenswrapper[4883]: E0310 09:07:19.264863 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.764847666 +0000 UTC m=+226.019745554 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.266052 4883 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T09:07:19.189891821Z","Handler":null,"Name":""} Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.269272 4883 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.269308 4883 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.365675 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.371723 4883 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.371762 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.410666 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.466694 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.480552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.522094 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x6pxw" event={"ID":"e9409438-97ce-43a6-8a7f-24764925eb53","Type":"ContainerStarted","Data":"f8bdb05cea0bd302fba067dc42e18876c9675c6fca3bd4b1c5448fe1446c2e77"}
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.525201 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"342796ca3b40259938e8434916bb313e698fc95663ce82432b26ebecce0089be"}
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.525251 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"a5bc9dcf31d8aa2826a579e635e9181b293dce8c155590c95e0a74bb24d8f056"}
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.527886 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" event={"ID":"d94eaa88-cfd0-497d-804d-922ebd316b33","Type":"ContainerStarted","Data":"b1ad82f63d2a99d4c686f8077a3a5a33c60b5bdf754b5d571abea724c8d4a2d9"}
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.527931 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" event={"ID":"d94eaa88-cfd0-497d-804d-922ebd316b33","Type":"ContainerStarted","Data":"09858184ad8944a52a3322affd78a26bf48307d0abb44fec979288789d56074b"}
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.529704 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" event={"ID":"6fb9cd04-d1cb-446b-9bab-b054c51df85c","Type":"ContainerStarted","Data":"3159005aa4d5ed199258cc991c0ef348bfbd4c940d7be1c132b8c59b0c379eda"}
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.531890 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" event={"ID":"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2","Type":"ContainerStarted","Data":"554637a79e4a001b48f5e3b6a5c4bdf1ee7b4b7888edec4d101443ca83dc3d39"}
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.534829 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerName="controller-manager" containerID="cri-o://6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0" gracePeriod=30
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.535162 4883 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ndt59 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body=
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.535198 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.541789 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.541961 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.542064 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.542207 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.572245 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" podStartSLOduration=197.572226203 podStartE2EDuration="3m17.572226203s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:19.556759218 +0000 UTC m=+225.811657108" watchObservedRunningTime="2026-03-10 09:07:19.572226203 +0000 UTC m=+225.827124092"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.633908 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" podStartSLOduration=197.633889486 podStartE2EDuration="3m17.633889486s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:19.633732871 +0000 UTC m=+225.888630760" watchObservedRunningTime="2026-03-10 09:07:19.633889486 +0000 UTC m=+225.888787375"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.673649 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 09:07:19 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld
Mar 10 09:07:19 crc kubenswrapper[4883]: [+]process-running ok
Mar 10 09:07:19 crc kubenswrapper[4883]: healthz check failed
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.673714 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.713926 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.990150 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.021286 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078195 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") "
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078293 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") "
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078389 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") "
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078418 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") "
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078497 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") "
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.079181 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config" (OuterVolumeSpecName: "config") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.079881 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.084710 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.085060 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc" (OuterVolumeSpecName: "kube-api-access-2qmdc") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "kube-api-access-2qmdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.085998 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.087413 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.095072 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tlhr4"]
Mar 10 09:07:20 crc kubenswrapper[4883]: E0310 09:07:20.095329 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerName="controller-manager"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.095348 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerName="controller-manager"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.095450 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerName="controller-manager"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.096133 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.100813 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.106672 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183440 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183511 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djb46\" (UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183651 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183717 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183729 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183740 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183751 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") on node \"crc\" DevicePath \"\""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183761 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.268252 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.269172 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.272122 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.278932 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.284860 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.284931 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.284967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djb46\" (UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.285345 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.285938 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.311280 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djb46\" (UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.386441 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.386531 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.386560 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.464817 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cwsth"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.465798 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.465937 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlhr4"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.476044 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cwsth"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.487635 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.487706 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.487732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.488434 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.488694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.506361 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.543127 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" event={"ID":"7bfdcb1d-e416-438c-9916-5c42cf35f2eb","Type":"ContainerStarted","Data":"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431"}
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.543239 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" event={"ID":"7bfdcb1d-e416-438c-9916-5c42cf35f2eb","Type":"ContainerStarted","Data":"ba9d597cdd4e690659606d934bb4d1fb3e310147327af93f1ac8149f438281d6"}
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.543281 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.547020 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"13258bb7a6eba84c564ff7253911568d80abdde2123bae7297828fc1d9a80d59"}
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.548718 4883 generic.go:334] "Generic (PLEG): container finished" podID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerID="6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0" exitCode=0
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.548946 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.548969 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" event={"ID":"313c1e2e-103d-4418-ab8d-9c1e1661f3f7","Type":"ContainerDied","Data":"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0"}
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.548998 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" event={"ID":"313c1e2e-103d-4418-ab8d-9c1e1661f3f7","Type":"ContainerDied","Data":"4c062e8bdd69b4e921c05bdb270d650295db96c62cdadbfa314f1f418088417e"}
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.549020 4883 scope.go:117] "RemoveContainer" containerID="6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.552845 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" containerID="cri-o://4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" gracePeriod=30
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.554287 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.563551 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" podStartSLOduration=198.563512095 podStartE2EDuration="3m18.563512095s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:20.558222197 +0000 UTC m=+226.813120086" watchObservedRunningTime="2026-03-10 09:07:20.563512095 +0000 UTC m=+226.818409984"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.577151 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" podStartSLOduration=9.577139516 podStartE2EDuration="9.577139516s" podCreationTimestamp="2026-03-10 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:20.574860847 +0000 UTC m=+226.829758737" watchObservedRunningTime="2026-03-10 09:07:20.577139516 +0000 UTC m=+226.832037405"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.577858 4883 scope.go:117] "RemoveContainer" containerID="6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.580365 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ltgv7"
Mar 10 09:07:20 crc kubenswrapper[4883]: E0310 09:07:20.580910 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0\": container with ID starting with 6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0 not found: ID does not exist" containerID="6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.580951 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0"} err="failed to get container status \"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0\": rpc error: code = NotFound desc = could not find container \"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0\": container with ID starting with 6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0 not found: ID does not exist"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.589279 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.589347 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.589368 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6z4\" (UniqueName: \"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.600020 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.602856 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.665740 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.667016 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtqw6"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.668499 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 09:07:20 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld
Mar 10 09:07:20 crc kubenswrapper[4883]: [+]process-running ok
Mar 10 09:07:20 crc kubenswrapper[4883]: healthz check failed
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.668560 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.673000 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.676544 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"]
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.691546 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.691726 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz6z4\" (UniqueName: \"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.691770 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.705089 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.705400 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: W0310 09:07:20.716863 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa724d40_49c8_4d1d_a7e9_5af8f0603e19.slice/crio-bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192 WatchSource:0}: Error finding container bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192: Status 404 returned error can't find the container with id bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.737806 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6z4\" (UniqueName: \"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.793589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.793765 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.793916 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6"
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.801520 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.895739 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.896039 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.896113 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.896304 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.896535 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " 
pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.919535 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.924342 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.937664 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:20 crc kubenswrapper[4883]: E0310 09:07:20.937950 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.937962 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.938084 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.940113 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.944088 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.944373 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.944506 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.944943 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.945058 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.948429 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.949536 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.951070 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.952351 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34704: no serving certificate available for the kubelet" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.988852 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997304 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") pod \"da524055-8528-423f-9ccd-70198a4fbf99\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997368 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") pod \"da524055-8528-423f-9ccd-70198a4fbf99\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997466 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") pod \"da524055-8528-423f-9ccd-70198a4fbf99\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997623 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") pod \"da524055-8528-423f-9ccd-70198a4fbf99\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997825 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997868 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997892 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwc2\" (UniqueName: \"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997988 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.998109 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.998433 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config" (OuterVolumeSpecName: "config") pod "da524055-8528-423f-9ccd-70198a4fbf99" (UID: 
"da524055-8528-423f-9ccd-70198a4fbf99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.998460 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca" (OuterVolumeSpecName: "client-ca") pod "da524055-8528-423f-9ccd-70198a4fbf99" (UID: "da524055-8528-423f-9ccd-70198a4fbf99"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.001178 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz" (OuterVolumeSpecName: "kube-api-access-sdplz") pod "da524055-8528-423f-9ccd-70198a4fbf99" (UID: "da524055-8528-423f-9ccd-70198a4fbf99"). InnerVolumeSpecName "kube-api-access-sdplz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.001380 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "da524055-8528-423f-9ccd-70198a4fbf99" (UID: "da524055-8528-423f-9ccd-70198a4fbf99"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.038150 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.040832 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:07:21 crc kubenswrapper[4883]: W0310 09:07:21.044921 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf98b0611_9639_4987_9e9b_0e1c4695a164.slice/crio-d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219 WatchSource:0}: Error finding container d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219: Status 404 returned error can't find the container with id d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099402 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099488 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099515 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099545 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwc2\" (UniqueName: \"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099589 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099639 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099656 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099669 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099679 4883 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.100664 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.100849 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.101898 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.102684 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.117982 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwc2\" (UniqueName: 
\"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.165873 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"] Mar 10 09:07:21 crc kubenswrapper[4883]: W0310 09:07:21.179754 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804d1679_c6e1_4594_b067_c41da8ee64ab.slice/crio-7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc WatchSource:0}: Error finding container 7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc: Status 404 returned error can't find the container with id 7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.274691 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.485281 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:21 crc kubenswrapper[4883]: W0310 09:07:21.496747 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod625af2a2_6c46_49c9_90bd_0730adfcf9a8.slice/crio-39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85 WatchSource:0}: Error finding container 39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85: Status 404 returned error can't find the container with id 39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.561320 4883 generic.go:334] "Generic (PLEG): container finished" podID="da524055-8528-423f-9ccd-70198a4fbf99" containerID="4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.561377 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" event={"ID":"da524055-8528-423f-9ccd-70198a4fbf99","Type":"ContainerDied","Data":"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.561408 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" event={"ID":"da524055-8528-423f-9ccd-70198a4fbf99","Type":"ContainerDied","Data":"c26a515962952b0dca378df9d0df683fab142453ca7aa14be72f83e3e38823fb"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.561428 4883 scope.go:117] "RemoveContainer" containerID="4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" Mar 10 09:07:21 crc kubenswrapper[4883]: 
I0310 09:07:21.561566 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.564214 4883 generic.go:334] "Generic (PLEG): container finished" podID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" containerID="d15aaac8a80d44b120c58eb49e71ed5d06a04daa936e43f5e09fe3fbce5d0142" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.564284 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" event={"ID":"0be14f8e-b9d8-4058-9be3-cdc61ce88626","Type":"ContainerDied","Data":"d15aaac8a80d44b120c58eb49e71ed5d06a04daa936e43f5e09fe3fbce5d0142"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.566558 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" event={"ID":"625af2a2-6c46-49c9-90bd-0730adfcf9a8","Type":"ContainerStarted","Data":"39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.569700 4883 generic.go:334] "Generic (PLEG): container finished" podID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerID="09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.569762 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerDied","Data":"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.569784 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" 
event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerStarted","Data":"d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.572164 4883 generic.go:334] "Generic (PLEG): container finished" podID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerID="fbcddb1697b17646b99e5eb8976196f07f60605511e41f78c1cc131392bba69c" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.572550 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerDied","Data":"fbcddb1697b17646b99e5eb8976196f07f60605511e41f78c1cc131392bba69c"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.572603 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerStarted","Data":"7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.575456 4883 generic.go:334] "Generic (PLEG): container finished" podID="816c3b00-c481-4c08-9691-0244d3c044e3" containerID="a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.575530 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerDied","Data":"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.575972 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerStarted","Data":"f5c87addac3f89a4858d25eb6fa3c57863872b10777952494e3f153096638f60"} Mar 10 09:07:21 crc kubenswrapper[4883]: 
I0310 09:07:21.581636 4883 generic.go:334] "Generic (PLEG): container finished" podID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerID="0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.581702 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerDied","Data":"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.581722 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerStarted","Data":"bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.598975 4883 scope.go:117] "RemoveContainer" containerID="4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" Mar 10 09:07:21 crc kubenswrapper[4883]: E0310 09:07:21.600788 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8\": container with ID starting with 4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8 not found: ID does not exist" containerID="4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.600825 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8"} err="failed to get container status \"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8\": rpc error: code = NotFound desc = could not find container \"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8\": container with ID starting 
with 4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8 not found: ID does not exist" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.638286 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.641286 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.663770 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:21 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:21 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:21 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.663828 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.888937 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.889871 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.891687 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.893681 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.899039 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.015564 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.015685 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.046724 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.047424 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.048719 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.049435 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.059360 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.086072 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" path="/var/lib/kubelet/pods/313c1e2e-103d-4418-ab8d-9c1e1661f3f7/volumes" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.086762 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da524055-8528-423f-9ccd-70198a4fbf99" path="/var/lib/kubelet/pods/da524055-8528-423f-9ccd-70198a4fbf99/volumes" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.117088 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.117159 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: 
I0310 09:07:22.117188 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.117244 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.117501 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.135718 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.203090 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.219239 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.219281 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.219370 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.233715 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.274688 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.277941 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.279140 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.281085 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.361670 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.423002 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.423045 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.423071 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525240 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525296 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525316 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525915 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525946 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.551301 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.589926 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" event={"ID":"625af2a2-6c46-49c9-90bd-0730adfcf9a8","Type":"ContainerStarted","Data":"7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e"} Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.590148 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.594609 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.595807 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.608318 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" podStartSLOduration=3.60829939 podStartE2EDuration="3.60829939s" podCreationTimestamp="2026-03-10 09:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:22.603294828 +0000 UTC m=+228.858192718" watchObservedRunningTime="2026-03-10 09:07:22.60829939 +0000 UTC m=+228.863197279" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.664744 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.664752 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:22 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:22 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:22 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.664927 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.665888 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.678163 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.729831 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.730243 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.730368 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.832004 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.832056 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.832107 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.832835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.833110 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.848404 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.937081 4883 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.938243 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.943884 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947068 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947310 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947398 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947551 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947544 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.953858 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.984011 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.037322 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.037367 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.037402 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.037421 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.139498 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.139572 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.139610 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.139634 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.140776 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " 
pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.141048 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.145574 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.156359 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.261337 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.273722 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.276038 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.279567 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.282755 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.356589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.356712 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.356750 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.458250 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") pod \"redhat-operators-vhnvt\" (UID: 
\"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.458367 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.458493 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.458953 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.459112 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.473359 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " 
pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.592122 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.663964 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:23 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:23 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:23 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.664033 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.669549 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"] Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.671297 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.684227 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"] Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.712013 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.762192 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.762270 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.762387 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.863237 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") pod \"redhat-operators-j5rwl\" (UID: 
\"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.863331 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.863365 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.864524 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.864556 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.878998 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " 
pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.976530 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.976611 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.978848 4883 patch_prober.go:28] interesting pod/console-f9d7485db-nbvf4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.978907 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nbvf4" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.989355 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.156295 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.156349 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.162651 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.340013 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.475616 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") pod \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.475705 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") pod \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.475801 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") pod \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.476833 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume" (OuterVolumeSpecName: "config-volume") pod "0be14f8e-b9d8-4058-9be3-cdc61ce88626" (UID: "0be14f8e-b9d8-4058-9be3-cdc61ce88626"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.481098 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4" (OuterVolumeSpecName: "kube-api-access-7txw4") pod "0be14f8e-b9d8-4058-9be3-cdc61ce88626" (UID: "0be14f8e-b9d8-4058-9be3-cdc61ce88626"). 
InnerVolumeSpecName "kube-api-access-7txw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.487529 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0be14f8e-b9d8-4058-9be3-cdc61ce88626" (UID: "0be14f8e-b9d8-4058-9be3-cdc61ce88626"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.577820 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.577851 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.577860 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.611755 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" event={"ID":"0be14f8e-b9d8-4058-9be3-cdc61ce88626","Type":"ContainerDied","Data":"b2de766c7f3e1df780306fb20dfcae02ace3d5080579f4d7e5fa2a3d5480fbbb"} Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.611794 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.611811 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2de766c7f3e1df780306fb20dfcae02ace3d5080579f4d7e5fa2a3d5480fbbb" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.615674 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.663664 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.668538 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:24 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:24 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:24 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.668595 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:25 crc kubenswrapper[4883]: I0310 09:07:25.663013 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:25 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:25 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:25 
crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:25 crc kubenswrapper[4883]: I0310 09:07:25.663322 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:26 crc kubenswrapper[4883]: I0310 09:07:26.031714 4883 ???:1] "http: TLS handshake error from 192.168.126.11:53968: no serving certificate available for the kubelet" Mar 10 09:07:26 crc kubenswrapper[4883]: I0310 09:07:26.133031 4883 ???:1] "http: TLS handshake error from 192.168.126.11:53980: no serving certificate available for the kubelet" Mar 10 09:07:26 crc kubenswrapper[4883]: I0310 09:07:26.662493 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:26 crc kubenswrapper[4883]: I0310 09:07:26.665315 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.143202 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"] Mar 10 09:07:27 crc kubenswrapper[4883]: W0310 09:07:27.165582 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod504544f8_69a4_4562_87d5_fa61335ea052.slice/crio-a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d WatchSource:0}: Error finding container a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d: Status 404 returned error can't find the container with id a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.187731 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:07:27 
crc kubenswrapper[4883]: I0310 09:07:27.300691 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:27 crc kubenswrapper[4883]: W0310 09:07:27.318940 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cbe9069_9970_4e7d_a2ec_d563c6b46a1c.slice/crio-04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac WatchSource:0}: Error finding container 04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac: Status 404 returned error can't find the container with id 04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.560866 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.621010 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.627332 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.633659 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 09:07:27 crc kubenswrapper[4883]: W0310 09:07:27.645571 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf1e38226_07ed_488a_b501_b3aeacb94bc6.slice/crio-ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58 WatchSource:0}: Error finding container ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58: Status 404 returned error can't find the container with id ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58 Mar 10 09:07:27 crc kubenswrapper[4883]: W0310 09:07:27.646914 4883 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e9f1e6_cc9a_45a0_9a03_e3b1526b5783.slice/crio-2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab WatchSource:0}: Error finding container 2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab: Status 404 returned error can't find the container with id 2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.656426 4883 generic.go:334] "Generic (PLEG): container finished" podID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerID="b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001" exitCode=0 Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.656540 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerDied","Data":"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.656613 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerStarted","Data":"04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.661293 4883 generic.go:334] "Generic (PLEG): container finished" podID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerID="50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7" exitCode=0 Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.661356 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerDied","Data":"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 
09:07:27.661385 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerStarted","Data":"7617eec4e807a31ae8dae401f57247ee0d7df593c7506b5c96f9dc3caf16e27a"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.664843 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" event={"ID":"632d4971-be4e-4939-a46a-42604b182436","Type":"ContainerStarted","Data":"4f14915d4656a1c4b614a36f6c062aeb515069a6fa76ae902ddf80e353fe48d5"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.667314 4883 generic.go:334] "Generic (PLEG): container finished" podID="504544f8-69a4-4562-87d5-fa61335ea052" containerID="d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11" exitCode=0 Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.667419 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerDied","Data":"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.667446 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerStarted","Data":"a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.735200 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" podStartSLOduration=76.149756017 podStartE2EDuration="1m27.735183725s" podCreationTimestamp="2026-03-10 09:06:00 +0000 UTC" firstStartedPulling="2026-03-10 09:07:15.451600344 +0000 UTC m=+221.706498232" lastFinishedPulling="2026-03-10 09:07:27.037028051 +0000 UTC m=+233.291925940" observedRunningTime="2026-03-10 
09:07:27.730421801 +0000 UTC m=+233.985319689" watchObservedRunningTime="2026-03-10 09:07:27.735183725 +0000 UTC m=+233.990081613" Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.937730 4883 csr.go:261] certificate signing request csr-hb9cx is approved, waiting to be issued Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.937888 4883 csr.go:257] certificate signing request csr-hb9cx is issued Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.689746 4883 generic.go:334] "Generic (PLEG): container finished" podID="f1e38226-07ed-488a-b501-b3aeacb94bc6" containerID="31f6e5116ed55f5b8d1843edebd1e3733bcdea144efa8c0f68bdfcaf678a7f01" exitCode=0 Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.690091 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f1e38226-07ed-488a-b501-b3aeacb94bc6","Type":"ContainerDied","Data":"31f6e5116ed55f5b8d1843edebd1e3733bcdea144efa8c0f68bdfcaf678a7f01"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.690176 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f1e38226-07ed-488a-b501-b3aeacb94bc6","Type":"ContainerStarted","Data":"ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.694893 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" event={"ID":"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783","Type":"ContainerStarted","Data":"0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.694933 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" 
event={"ID":"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783","Type":"ContainerStarted","Data":"2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.695344 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.697224 4883 generic.go:334] "Generic (PLEG): container finished" podID="83795b7e-47fe-45a0-85fe-63e8b880ddae" containerID="21f3fa430d885af5b7d8003d9010231094e6b0abcc772af52da4a0423d3fc2c7" exitCode=0 Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.697316 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83795b7e-47fe-45a0-85fe-63e8b880ddae","Type":"ContainerDied","Data":"21f3fa430d885af5b7d8003d9010231094e6b0abcc772af52da4a0423d3fc2c7"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.697364 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83795b7e-47fe-45a0-85fe-63e8b880ddae","Type":"ContainerStarted","Data":"2b6592679e3f9885f3d9a38472029972a6fc2dbf38719c4a43a188dbe997b60d"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.700449 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.701183 4883 generic.go:334] "Generic (PLEG): container finished" podID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerID="8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df" exitCode=0 Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.701250 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" 
event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerDied","Data":"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.701275 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerStarted","Data":"7cd4d72ef0244e1c6f3955303b46c7d75041bd13eacfaf569a15ddb645d99b32"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.704925 4883 generic.go:334] "Generic (PLEG): container finished" podID="632d4971-be4e-4939-a46a-42604b182436" containerID="4f14915d4656a1c4b614a36f6c062aeb515069a6fa76ae902ddf80e353fe48d5" exitCode=0 Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.704965 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" event={"ID":"632d4971-be4e-4939-a46a-42604b182436","Type":"ContainerDied","Data":"4f14915d4656a1c4b614a36f6c062aeb515069a6fa76ae902ddf80e353fe48d5"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.744900 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" podStartSLOduration=9.744745208 podStartE2EDuration="9.744745208s" podCreationTimestamp="2026-03-10 09:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:28.740701585 +0000 UTC m=+234.995599495" watchObservedRunningTime="2026-03-10 09:07:28.744745208 +0000 UTC m=+234.999643096" Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.940437 4883 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-14 05:43:17.032041194 +0000 UTC Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.940494 4883 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Waiting 6692h35m48.091571161s for next certificate rotation Mar 10 09:07:29 crc kubenswrapper[4883]: I0310 09:07:29.862904 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:29 crc kubenswrapper[4883]: I0310 09:07:29.941570 4883 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 12:39:00.080315664 +0000 UTC Mar 10 09:07:29 crc kubenswrapper[4883]: I0310 09:07:29.941605 4883 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6435h31m30.138712163s for next certificate rotation Mar 10 09:07:30 crc kubenswrapper[4883]: I0310 09:07:30.864300 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:30 crc kubenswrapper[4883]: I0310 09:07:30.984788 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") pod \"632d4971-be4e-4939-a46a-42604b182436\" (UID: \"632d4971-be4e-4939-a46a-42604b182436\") " Mar 10 09:07:30 crc kubenswrapper[4883]: I0310 09:07:30.990774 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd" (OuterVolumeSpecName: "kube-api-access-wkbdd") pod "632d4971-be4e-4939-a46a-42604b182436" (UID: "632d4971-be4e-4939-a46a-42604b182436"). InnerVolumeSpecName "kube-api-access-wkbdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.086708 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.191756 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.195703 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.288955 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") pod \"83795b7e-47fe-45a0-85fe-63e8b880ddae\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289011 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") pod \"f1e38226-07ed-488a-b501-b3aeacb94bc6\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289054 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "83795b7e-47fe-45a0-85fe-63e8b880ddae" (UID: "83795b7e-47fe-45a0-85fe-63e8b880ddae"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289231 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") pod \"83795b7e-47fe-45a0-85fe-63e8b880ddae\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289283 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") pod \"f1e38226-07ed-488a-b501-b3aeacb94bc6\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289437 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f1e38226-07ed-488a-b501-b3aeacb94bc6" (UID: "f1e38226-07ed-488a-b501-b3aeacb94bc6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289769 4883 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289788 4883 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.294241 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "83795b7e-47fe-45a0-85fe-63e8b880ddae" (UID: "83795b7e-47fe-45a0-85fe-63e8b880ddae"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.294966 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f1e38226-07ed-488a-b501-b3aeacb94bc6" (UID: "f1e38226-07ed-488a-b501-b3aeacb94bc6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.391200 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.391226 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.735221 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.735309 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f1e38226-07ed-488a-b501-b3aeacb94bc6","Type":"ContainerDied","Data":"ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58"} Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.735359 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.741096 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83795b7e-47fe-45a0-85fe-63e8b880ddae","Type":"ContainerDied","Data":"2b6592679e3f9885f3d9a38472029972a6fc2dbf38719c4a43a188dbe997b60d"} Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.741142 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b6592679e3f9885f3d9a38472029972a6fc2dbf38719c4a43a188dbe997b60d" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.741166 4883 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.745238 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" event={"ID":"632d4971-be4e-4939-a46a-42604b182436","Type":"ContainerDied","Data":"de61b8fd98c13cb0710e4560c66bd5b5056787cf78c03765e45a9dc01a3d0bf9"} Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.745286 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de61b8fd98c13cb0710e4560c66bd5b5056787cf78c03765e45a9dc01a3d0bf9" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.745360 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:33 crc kubenswrapper[4883]: I0310 09:07:33.981930 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:33 crc kubenswrapper[4883]: I0310 09:07:33.988942 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.534020 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.535272 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" containerID="cri-o://7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e" gracePeriod=30 Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.548887 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:38 crc 
kubenswrapper[4883]: I0310 09:07:38.549125 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerName="route-controller-manager" containerID="cri-o://0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1" gracePeriod=30 Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.800709 4883 generic.go:334] "Generic (PLEG): container finished" podID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerID="7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e" exitCode=0 Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.800808 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" event={"ID":"625af2a2-6c46-49c9-90bd-0730adfcf9a8","Type":"ContainerDied","Data":"7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e"} Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.803415 4883 generic.go:334] "Generic (PLEG): container finished" podID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerID="0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1" exitCode=0 Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.803527 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" event={"ID":"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783","Type":"ContainerDied","Data":"0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1"} Mar 10 09:07:39 crc kubenswrapper[4883]: I0310 09:07:39.719715 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.275926 4883 patch_prober.go:28] interesting pod/controller-manager-b79978d66-7m8kr container/controller-manager namespace/openshift-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.276380 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.318373 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340062 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340298 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83795b7e-47fe-45a0-85fe-63e8b880ddae" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340312 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="83795b7e-47fe-45a0-85fe-63e8b880ddae" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340321 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerName="route-controller-manager" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340327 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerName="route-controller-manager" Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340337 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" containerName="collect-profiles" Mar 10 09:07:41 crc 
kubenswrapper[4883]: I0310 09:07:41.340342 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" containerName="collect-profiles" Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340353 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e38226-07ed-488a-b501-b3aeacb94bc6" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340358 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e38226-07ed-488a-b501-b3aeacb94bc6" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340365 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632d4971-be4e-4939-a46a-42604b182436" containerName="oc" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340371 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="632d4971-be4e-4939-a46a-42604b182436" containerName="oc" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340460 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerName="route-controller-manager" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340494 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" containerName="collect-profiles" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340504 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="632d4971-be4e-4939-a46a-42604b182436" containerName="oc" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340511 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="83795b7e-47fe-45a0-85fe-63e8b880ddae" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340520 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e38226-07ed-488a-b501-b3aeacb94bc6" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340986 4883 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.348461 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.471566 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") pod \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.471747 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") pod \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.471783 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") pod \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.471849 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") pod \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472059 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsg2\" (UniqueName: 
\"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472167 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472190 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472225 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472339 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca" (OuterVolumeSpecName: "client-ca") pod "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" (UID: "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472426 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config" (OuterVolumeSpecName: "config") pod "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" (UID: "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.477023 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv" (OuterVolumeSpecName: "kube-api-access-58qdv") pod "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" (UID: "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783"). InnerVolumeSpecName "kube-api-access-58qdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.478248 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" (UID: "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.573186 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.574012 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.574093 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.574909 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.574964 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") pod 
\"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.575875 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsg2\" (UniqueName: \"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.576637 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.576678 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.576757 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.576854 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.580496 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: 
\"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.589609 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsg2\" (UniqueName: \"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.661062 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.831261 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" event={"ID":"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783","Type":"ContainerDied","Data":"2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab"} Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.831585 4883 scope.go:117] "RemoveContainer" containerID="0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.831314 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.856710 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.859639 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:42 crc kubenswrapper[4883]: I0310 09:07:42.102316 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" path="/var/lib/kubelet/pods/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783/volumes" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.449724 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.450122 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.667950 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.697183 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:47 crc kubenswrapper[4883]: E0310 09:07:47.697523 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.697545 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.697669 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.698147 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.700794 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.767889 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.767945 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.767987 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.768098 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbwc2\" (UniqueName: \"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.768150 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: 
\"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.769546 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.769574 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.770509 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config" (OuterVolumeSpecName: "config") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.776102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2" (OuterVolumeSpecName: "kube-api-access-zbwc2") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "kube-api-access-zbwc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.776254 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.876433 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877034 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877090 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877127 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877194 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877250 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877262 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877272 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877285 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbwc2\" (UniqueName: \"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877294 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") on node \"crc\" DevicePath 
\"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.919865 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerStarted","Data":"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6"} Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.924922 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerStarted","Data":"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c"} Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.926735 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" event={"ID":"625af2a2-6c46-49c9-90bd-0730adfcf9a8","Type":"ContainerDied","Data":"39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85"} Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.926775 4883 scope.go:117] "RemoveContainer" containerID="7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.926896 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978536 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978598 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978631 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978680 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978717 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.979988 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.981955 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.982665 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.985933 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.989792 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.996079 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.999579 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.030761 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.097177 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" path="/var/lib/kubelet/pods/625af2a2-6c46-49c9-90bd-0730adfcf9a8/volumes" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.109618 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.290790 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:48 crc kubenswrapper[4883]: W0310 09:07:48.423516 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod219a69fb_a146_4034_b934_3f1f8f81b338.slice/crio-7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667 WatchSource:0}: Error finding container 7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667: Status 404 returned error can't find the container 
with id 7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.942707 4883 generic.go:334] "Generic (PLEG): container finished" podID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerID="4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.942874 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerDied","Data":"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.947761 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" event={"ID":"48a61604-9f49-4b8f-8534-707af35c4667","Type":"ContainerStarted","Data":"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.947808 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" event={"ID":"48a61604-9f49-4b8f-8534-707af35c4667","Type":"ContainerStarted","Data":"545697c18b61099ee4c8abb7b405fe27097c321bcbe2376ab05f34b5b5edc3c0"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.949058 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.954797 4883 generic.go:334] "Generic (PLEG): container finished" podID="504544f8-69a4-4562-87d5-fa61335ea052" containerID="f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.954861 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" 
event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerDied","Data":"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.958010 4883 generic.go:334] "Generic (PLEG): container finished" podID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerID="4b273f81ff4709aae90e59a96f5a8a9a4b8f566fb0b34b2f69e7956ff48dc97f" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.958062 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerDied","Data":"4b273f81ff4709aae90e59a96f5a8a9a4b8f566fb0b34b2f69e7956ff48dc97f"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.960536 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.963633 4883 generic.go:334] "Generic (PLEG): container finished" podID="816c3b00-c481-4c08-9691-0244d3c044e3" containerID="a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.963695 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerDied","Data":"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.969275 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" event={"ID":"219a69fb-a146-4034-b934-3f1f8f81b338","Type":"ContainerStarted","Data":"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.969303 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" event={"ID":"219a69fb-a146-4034-b934-3f1f8f81b338","Type":"ContainerStarted","Data":"7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.969726 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.972106 4883 generic.go:334] "Generic (PLEG): container finished" podID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerID="168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.972268 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerDied","Data":"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.975236 4883 generic.go:334] "Generic (PLEG): container finished" podID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerID="db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.975291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerDied","Data":"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.975986 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.981262 4883 generic.go:334] "Generic (PLEG): container finished" podID="7746695d-3e1f-455d-9acc-dffdba42c0d5" 
containerID="055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.981406 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerDied","Data":"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.984416 4883 generic.go:334] "Generic (PLEG): container finished" podID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerID="e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.984459 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerDied","Data":"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050"} Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.009915 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" podStartSLOduration=11.009885949 podStartE2EDuration="11.009885949s" podCreationTimestamp="2026-03-10 09:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:49.007993466 +0000 UTC m=+255.262891355" watchObservedRunningTime="2026-03-10 09:07:49.009885949 +0000 UTC m=+255.264783858" Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.048639 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" podStartSLOduration=11.048626512 podStartE2EDuration="11.048626512s" podCreationTimestamp="2026-03-10 09:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:49.048589362 +0000 UTC m=+255.303487250" watchObservedRunningTime="2026-03-10 09:07:49.048626512 +0000 UTC m=+255.303524401" Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.993746 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerStarted","Data":"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40"} Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.997615 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerStarted","Data":"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125"} Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.999884 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerStarted","Data":"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.002498 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerStarted","Data":"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.004751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerStarted","Data":"a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.006692 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" 
event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerStarted","Data":"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.008493 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerStarted","Data":"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.011108 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerStarted","Data":"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.020265 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vhnvt" podStartSLOduration=5.161998165 podStartE2EDuration="27.02025427s" podCreationTimestamp="2026-03-10 09:07:23 +0000 UTC" firstStartedPulling="2026-03-10 09:07:27.682707786 +0000 UTC m=+233.937605675" lastFinishedPulling="2026-03-10 09:07:49.54096389 +0000 UTC m=+255.795861780" observedRunningTime="2026-03-10 09:07:50.017622846 +0000 UTC m=+256.272520735" watchObservedRunningTime="2026-03-10 09:07:50.02025427 +0000 UTC m=+256.275152158" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.056239 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ltgv7" podStartSLOduration=2.093811966 podStartE2EDuration="30.056227554s" podCreationTimestamp="2026-03-10 09:07:20 +0000 UTC" firstStartedPulling="2026-03-10 09:07:21.599209394 +0000 UTC m=+227.854107284" lastFinishedPulling="2026-03-10 09:07:49.561624982 +0000 UTC m=+255.816522872" observedRunningTime="2026-03-10 09:07:50.055107436 +0000 UTC m=+256.310005324" 
watchObservedRunningTime="2026-03-10 09:07:50.056227554 +0000 UTC m=+256.311125443" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.056753 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p22dp" podStartSLOduration=6.241165868 podStartE2EDuration="28.056748866s" podCreationTimestamp="2026-03-10 09:07:22 +0000 UTC" firstStartedPulling="2026-03-10 09:07:27.666673374 +0000 UTC m=+233.921571273" lastFinishedPulling="2026-03-10 09:07:49.482256382 +0000 UTC m=+255.737154271" observedRunningTime="2026-03-10 09:07:50.039823336 +0000 UTC m=+256.294721225" watchObservedRunningTime="2026-03-10 09:07:50.056748866 +0000 UTC m=+256.311646755" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.075309 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dtqw6" podStartSLOduration=2.121823455 podStartE2EDuration="30.075293994s" podCreationTimestamp="2026-03-10 09:07:20 +0000 UTC" firstStartedPulling="2026-03-10 09:07:21.599645405 +0000 UTC m=+227.854543294" lastFinishedPulling="2026-03-10 09:07:49.553115945 +0000 UTC m=+255.808013833" observedRunningTime="2026-03-10 09:07:50.071378504 +0000 UTC m=+256.326276392" watchObservedRunningTime="2026-03-10 09:07:50.075293994 +0000 UTC m=+256.330191883" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.088930 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cwsth" podStartSLOduration=2.102009377 podStartE2EDuration="30.088916787s" podCreationTimestamp="2026-03-10 09:07:20 +0000 UTC" firstStartedPulling="2026-03-10 09:07:21.59911154 +0000 UTC m=+227.854009429" lastFinishedPulling="2026-03-10 09:07:49.58601895 +0000 UTC m=+255.840916839" observedRunningTime="2026-03-10 09:07:50.085219097 +0000 UTC m=+256.340116986" watchObservedRunningTime="2026-03-10 09:07:50.088916787 +0000 UTC m=+256.343814667" Mar 10 09:07:50 crc 
kubenswrapper[4883]: I0310 09:07:50.102879 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tlhr4" podStartSLOduration=2.227598451 podStartE2EDuration="30.102859903s" podCreationTimestamp="2026-03-10 09:07:20 +0000 UTC" firstStartedPulling="2026-03-10 09:07:21.599576486 +0000 UTC m=+227.854474375" lastFinishedPulling="2026-03-10 09:07:49.474837937 +0000 UTC m=+255.729735827" observedRunningTime="2026-03-10 09:07:50.100542612 +0000 UTC m=+256.355440501" watchObservedRunningTime="2026-03-10 09:07:50.102859903 +0000 UTC m=+256.357757792" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.133940 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j5rwl" podStartSLOduration=5.249434762 podStartE2EDuration="27.133926061s" podCreationTimestamp="2026-03-10 09:07:23 +0000 UTC" firstStartedPulling="2026-03-10 09:07:27.693413758 +0000 UTC m=+233.948311647" lastFinishedPulling="2026-03-10 09:07:49.577905057 +0000 UTC m=+255.832802946" observedRunningTime="2026-03-10 09:07:50.117624736 +0000 UTC m=+256.372522626" watchObservedRunningTime="2026-03-10 09:07:50.133926061 +0000 UTC m=+256.388823951" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.466460 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.466536 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.581487 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.581541 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ltgv7" 
Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.802373 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.802729 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.989462 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.989809 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:51 crc kubenswrapper[4883]: I0310 09:07:51.552714 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-tlhr4" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:51 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:51 crc kubenswrapper[4883]: > Mar 10 09:07:51 crc kubenswrapper[4883]: I0310 09:07:51.612605 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ltgv7" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:51 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:51 crc kubenswrapper[4883]: > Mar 10 09:07:51 crc kubenswrapper[4883]: I0310 09:07:51.832062 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-cwsth" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:51 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:51 crc 
kubenswrapper[4883]: > Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.020210 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dtqw6" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:52 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:52 crc kubenswrapper[4883]: > Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.596029 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.596192 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.635318 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.652637 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2h5dv" podStartSLOduration=11.992897582 podStartE2EDuration="30.652621734s" podCreationTimestamp="2026-03-10 09:07:22 +0000 UTC" firstStartedPulling="2026-03-10 09:07:30.83190895 +0000 UTC m=+237.086806839" lastFinishedPulling="2026-03-10 09:07:49.491633102 +0000 UTC m=+255.746530991" observedRunningTime="2026-03-10 09:07:50.13617275 +0000 UTC m=+256.391070639" watchObservedRunningTime="2026-03-10 09:07:52.652621734 +0000 UTC m=+258.907519623" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.984861 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.984993 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.017355 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.592621 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.592677 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.990229 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.990286 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:54 crc kubenswrapper[4883]: I0310 09:07:54.068412 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:54 crc kubenswrapper[4883]: I0310 09:07:54.068533 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:54 crc kubenswrapper[4883]: I0310 09:07:54.470905 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:54 crc kubenswrapper[4883]: I0310 09:07:54.623549 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vhnvt" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:54 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:54 crc 
kubenswrapper[4883]: > Mar 10 09:07:55 crc kubenswrapper[4883]: I0310 09:07:55.021746 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j5rwl" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:55 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:55 crc kubenswrapper[4883]: > Mar 10 09:07:56 crc kubenswrapper[4883]: I0310 09:07:56.549001 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:56 crc kubenswrapper[4883]: I0310 09:07:56.549553 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p22dp" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="registry-server" containerID="cri-o://38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" gracePeriod=2 Mar 10 09:07:56 crc kubenswrapper[4883]: I0310 09:07:56.943754 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.012585 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") pod \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.013365 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities" (OuterVolumeSpecName: "utilities") pod "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" (UID: "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049013 4883 generic.go:334] "Generic (PLEG): container finished" podID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerID="38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" exitCode=0 Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049040 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049052 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerDied","Data":"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff"} Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049084 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerDied","Data":"04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac"} Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049114 4883 scope.go:117] "RemoveContainer" containerID="38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.066378 4883 scope.go:117] "RemoveContainer" containerID="168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.078714 4883 scope.go:117] "RemoveContainer" containerID="b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.097598 4883 scope.go:117] "RemoveContainer" containerID="38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.097986 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff\": container with ID starting with 38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff not found: ID does not exist" containerID="38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098035 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff"} err="failed to get container status \"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff\": rpc error: code = NotFound desc = could not find container \"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff\": container with ID starting with 38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff not found: ID does not exist" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098064 4883 scope.go:117] "RemoveContainer" containerID="168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.098452 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c\": container with ID starting with 168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c not found: ID does not exist" containerID="168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098500 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c"} err="failed to get container status \"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c\": rpc error: code = NotFound desc = could not find container 
\"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c\": container with ID starting with 168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c not found: ID does not exist" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098517 4883 scope.go:117] "RemoveContainer" containerID="b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.098805 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001\": container with ID starting with b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001 not found: ID does not exist" containerID="b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098826 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001"} err="failed to get container status \"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001\": rpc error: code = NotFound desc = could not find container \"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001\": container with ID starting with b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001 not found: ID does not exist" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.113440 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") pod \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.113530 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") pod \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.113914 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.120849 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r" (OuterVolumeSpecName: "kube-api-access-lqv7r") pod "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" (UID: "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c"). InnerVolumeSpecName "kube-api-access-lqv7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.140061 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" (UID: "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.214892 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.215196 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.295292 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.296008 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="registry-server" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.296031 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="registry-server" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.296057 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="extract-content" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.296064 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="extract-content" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.296086 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="extract-utilities" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.296096 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="extract-utilities" Mar 10 09:07:57 crc 
kubenswrapper[4883]: I0310 09:07:57.296359 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="registry-server" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.297304 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.301172 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.301494 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.319008 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.321536 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.321587 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.375535 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.376349 4883 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.423971 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.424026 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.424129 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.443120 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.630576 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.003289 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.058308 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"68a11b02-0067-46ed-84fe-c764e88b2810","Type":"ContainerStarted","Data":"617b5bb81242e03395057fe95d456d123903cf5a9eb30f13bfcebeef10f8dca9"} Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.086594 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" path="/var/lib/kubelet/pods/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c/volumes" Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.492641 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.493112 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" containerName="controller-manager" containerID="cri-o://a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" gracePeriod=30 Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.520461 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.520748 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" podUID="48a61604-9f49-4b8f-8534-707af35c4667" containerName="route-controller-manager" 
containerID="cri-o://a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" gracePeriod=30 Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.953194 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.011466 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.044168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") pod \"48a61604-9f49-4b8f-8534-707af35c4667\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.044950 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config" (OuterVolumeSpecName: "config") pod "48a61604-9f49-4b8f-8534-707af35c4667" (UID: "48a61604-9f49-4b8f-8534-707af35c4667"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066178 4883 generic.go:334] "Generic (PLEG): container finished" podID="219a69fb-a146-4034-b934-3f1f8f81b338" containerID="a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" exitCode=0 Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066246 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" event={"ID":"219a69fb-a146-4034-b934-3f1f8f81b338","Type":"ContainerDied","Data":"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066280 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" event={"ID":"219a69fb-a146-4034-b934-3f1f8f81b338","Type":"ContainerDied","Data":"7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066298 4883 scope.go:117] "RemoveContainer" containerID="a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066421 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.070720 4883 generic.go:334] "Generic (PLEG): container finished" podID="68a11b02-0067-46ed-84fe-c764e88b2810" containerID="5ea3a1acee75ae8a56c0a060d638503d84669497c3c2aebe6a60cd9076f24524" exitCode=0 Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.070799 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"68a11b02-0067-46ed-84fe-c764e88b2810","Type":"ContainerDied","Data":"5ea3a1acee75ae8a56c0a060d638503d84669497c3c2aebe6a60cd9076f24524"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.072232 4883 generic.go:334] "Generic (PLEG): container finished" podID="48a61604-9f49-4b8f-8534-707af35c4667" containerID="a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" exitCode=0 Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.072263 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" event={"ID":"48a61604-9f49-4b8f-8534-707af35c4667","Type":"ContainerDied","Data":"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.072454 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" event={"ID":"48a61604-9f49-4b8f-8534-707af35c4667","Type":"ContainerDied","Data":"545697c18b61099ee4c8abb7b405fe27097c321bcbe2376ab05f34b5b5edc3c0"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.072268 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.083009 4883 scope.go:117] "RemoveContainer" containerID="a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" Mar 10 09:07:59 crc kubenswrapper[4883]: E0310 09:07:59.083305 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6\": container with ID starting with a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6 not found: ID does not exist" containerID="a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.083341 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6"} err="failed to get container status \"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6\": rpc error: code = NotFound desc = could not find container \"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6\": container with ID starting with a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6 not found: ID does not exist" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.083363 4883 scope.go:117] "RemoveContainer" containerID="a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.101767 4883 scope.go:117] "RemoveContainer" containerID="a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" Mar 10 09:07:59 crc kubenswrapper[4883]: E0310 09:07:59.102055 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011\": container with ID starting with 
a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011 not found: ID does not exist" containerID="a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.102087 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011"} err="failed to get container status \"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011\": rpc error: code = NotFound desc = could not find container \"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011\": container with ID starting with a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011 not found: ID does not exist" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.145527 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.146504 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca" (OuterVolumeSpecName: "client-ca") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.146582 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.146606 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147080 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147114 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") pod \"48a61604-9f49-4b8f-8534-707af35c4667\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147152 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca" (OuterVolumeSpecName: "client-ca") pod "48a61604-9f49-4b8f-8534-707af35c4667" (UID: "48a61604-9f49-4b8f-8534-707af35c4667"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") pod \"48a61604-9f49-4b8f-8534-707af35c4667\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147254 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqsg2\" (UniqueName: \"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") pod \"48a61604-9f49-4b8f-8534-707af35c4667\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147289 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147284 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config" (OuterVolumeSpecName: "config") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147753 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147776 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147785 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147796 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147806 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.152225 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x" (OuterVolumeSpecName: "kube-api-access-fct8x") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "kube-api-access-fct8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.152336 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2" (OuterVolumeSpecName: "kube-api-access-nqsg2") pod "48a61604-9f49-4b8f-8534-707af35c4667" (UID: "48a61604-9f49-4b8f-8534-707af35c4667"). InnerVolumeSpecName "kube-api-access-nqsg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.152423 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "48a61604-9f49-4b8f-8534-707af35c4667" (UID: "48a61604-9f49-4b8f-8534-707af35c4667"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.171095 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.249451 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.249494 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.249506 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqsg2\" (UniqueName: \"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.249515 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.392317 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.396116 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.404630 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.407262 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962375 4883 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:07:59 crc kubenswrapper[4883]: E0310 09:07:59.962667 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a61604-9f49-4b8f-8534-707af35c4667" containerName="route-controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962681 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a61604-9f49-4b8f-8534-707af35c4667" containerName="route-controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: E0310 09:07:59.962710 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" containerName="controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962716 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" containerName="controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962827 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a61604-9f49-4b8f-8534-707af35c4667" containerName="route-controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962844 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" containerName="controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.963317 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.964966 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.965432 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.965661 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.965920 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.966239 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.966834 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.967128 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.967740 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.969792 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.969989 4883 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.970215 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.970815 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.970921 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.971174 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.973104 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.974784 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.976739 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066147 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066543 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066779 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066837 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066868 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066895 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " 
pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066946 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.067076 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.067125 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s545\" (UniqueName: \"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.086263 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" path="/var/lib/kubelet/pods/219a69fb-a146-4034-b934-3f1f8f81b338/volumes" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.086786 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a61604-9f49-4b8f-8534-707af35c4667" path="/var/lib/kubelet/pods/48a61604-9f49-4b8f-8534-707af35c4667/volumes" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.132491 4883 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"] Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.133163 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.135206 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.135355 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.135462 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.139493 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"] Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168134 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168426 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168453 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s545\" (UniqueName: 
\"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168560 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168582 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168627 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") pod \"auto-csr-approver-29552228-kn7mm\" (UID: \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\") " pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168684 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc 
kubenswrapper[4883]: I0310 09:08:00.168712 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168728 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168755 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170287 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170297 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: 
\"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170299 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170352 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170340 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.172659 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.173226 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") pod 
\"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.182601 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s545\" (UniqueName: \"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.184102 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.270026 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") pod \"auto-csr-approver-29552228-kn7mm\" (UID: \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\") " pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.280042 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.287468 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.289206 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") pod \"auto-csr-approver-29552228-kn7mm\" (UID: \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\") " pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.312840 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.370951 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") pod \"68a11b02-0067-46ed-84fe-c764e88b2810\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.371031 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") pod \"68a11b02-0067-46ed-84fe-c764e88b2810\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.371072 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "68a11b02-0067-46ed-84fe-c764e88b2810" (UID: "68a11b02-0067-46ed-84fe-c764e88b2810"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.371315 4883 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.375426 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "68a11b02-0067-46ed-84fe-c764e88b2810" (UID: "68a11b02-0067-46ed-84fe-c764e88b2810"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.443949 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.472015 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.497555 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.534590 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.598633 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"] Mar 10 09:08:00 crc kubenswrapper[4883]: W0310 09:08:00.603368 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc07643a3_c0a9_4770_a08e_ab4fb32dfe8e.slice/crio-f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a WatchSource:0}: Error finding container f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a: Status 404 returned error can't find the container with id f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.620633 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.657331 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:08:00 crc kubenswrapper[4883]: W0310 09:08:00.659361 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f1f59ff_042d_4e9f_a4d9_06a1d99492cc.slice/crio-199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd WatchSource:0}: Error finding container 199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd: Status 404 returned error can't find the container with id 199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.662860 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.700636 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.836261 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.875114 4883 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.045178 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.077116 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.088117 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" event={"ID":"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e","Type":"ContainerStarted","Data":"f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.090512 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" event={"ID":"78f401ec-703b-4789-8453-89b7a572a89a","Type":"ContainerStarted","Data":"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.090550 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" event={"ID":"78f401ec-703b-4789-8453-89b7a572a89a","Type":"ContainerStarted","Data":"f055423efedb6df834dff4c56e0e74a59306fd98bac74cffa507280ee1ff3f83"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.090666 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.092247 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"68a11b02-0067-46ed-84fe-c764e88b2810","Type":"ContainerDied","Data":"617b5bb81242e03395057fe95d456d123903cf5a9eb30f13bfcebeef10f8dca9"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.092278 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="617b5bb81242e03395057fe95d456d123903cf5a9eb30f13bfcebeef10f8dca9" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.092330 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.094431 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.095222 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" event={"ID":"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc","Type":"ContainerStarted","Data":"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.095250 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" event={"ID":"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc","Type":"ContainerStarted","Data":"199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.108223 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" podStartSLOduration=3.1082113 podStartE2EDuration="3.1082113s" podCreationTimestamp="2026-03-10 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:01.105947075 +0000 UTC m=+267.360844965" 
watchObservedRunningTime="2026-03-10 09:08:01.1082113 +0000 UTC m=+267.363109189" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.956061 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" podStartSLOduration=3.956041359 podStartE2EDuration="3.956041359s" podCreationTimestamp="2026-03-10 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:01.129093663 +0000 UTC m=+267.383991552" watchObservedRunningTime="2026-03-10 09:08:01.956041359 +0000 UTC m=+268.210939248" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.956235 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.101990 4883 generic.go:334] "Generic (PLEG): container finished" podID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" containerID="8d2862eee27c865a5680228f73b67899d38c264111c25020319e7ec39c7a9c80" exitCode=0 Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.102041 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" event={"ID":"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e","Type":"ContainerDied","Data":"8d2862eee27c865a5680228f73b67899d38c264111c25020319e7ec39c7a9c80"} Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.102641 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.102804 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cwsth" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server" 
containerID="cri-o://3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" gracePeriod=2 Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.108123 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.427707 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.497742 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") pod \"f98b0611-9639-4987-9e9b-0e1c4695a164\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.497804 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz6z4\" (UniqueName: \"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") pod \"f98b0611-9639-4987-9e9b-0e1c4695a164\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.499743 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") pod \"f98b0611-9639-4987-9e9b-0e1c4695a164\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.500499 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities" (OuterVolumeSpecName: "utilities") pod "f98b0611-9639-4987-9e9b-0e1c4695a164" (UID: "f98b0611-9639-4987-9e9b-0e1c4695a164"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.501431 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.505635 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4" (OuterVolumeSpecName: "kube-api-access-cz6z4") pod "f98b0611-9639-4987-9e9b-0e1c4695a164" (UID: "f98b0611-9639-4987-9e9b-0e1c4695a164"). InnerVolumeSpecName "kube-api-access-cz6z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.543730 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f98b0611-9639-4987-9e9b-0e1c4695a164" (UID: "f98b0611-9639-4987-9e9b-0e1c4695a164"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.603267 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.603305 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz6z4\" (UniqueName: \"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.949831 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"] Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.950345 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dtqw6" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server" containerID="cri-o://a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab" gracePeriod=2 Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.111653 4883 generic.go:334] "Generic (PLEG): container finished" podID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerID="a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab" exitCode=0 Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.111730 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerDied","Data":"a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab"} Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.114104 4883 generic.go:334] "Generic (PLEG): container finished" podID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerID="3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" exitCode=0 Mar 10 
09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.114347 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.115503 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerDied","Data":"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28"} Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.115568 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerDied","Data":"d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219"} Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.115595 4883 scope.go:117] "RemoveContainer" containerID="3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.131732 4883 scope.go:117] "RemoveContainer" containerID="e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.147045 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.159216 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.168095 4883 scope.go:117] "RemoveContainer" containerID="09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.186732 4883 scope.go:117] "RemoveContainer" containerID="3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.187424 4883 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28\": container with ID starting with 3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28 not found: ID does not exist" containerID="3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.187497 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28"} err="failed to get container status \"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28\": rpc error: code = NotFound desc = could not find container \"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28\": container with ID starting with 3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28 not found: ID does not exist" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.187546 4883 scope.go:117] "RemoveContainer" containerID="e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.187967 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050\": container with ID starting with e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050 not found: ID does not exist" containerID="e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.188017 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050"} err="failed to get container status \"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050\": rpc error: code = NotFound desc = could not find container 
\"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050\": container with ID starting with e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050 not found: ID does not exist"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.188050 4883 scope.go:117] "RemoveContainer" containerID="09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd"
Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.188659 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd\": container with ID starting with 09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd not found: ID does not exist" containerID="09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.188691 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd"} err="failed to get container status \"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd\": rpc error: code = NotFound desc = could not find container \"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd\": container with ID starting with 09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd not found: ID does not exist"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.318679 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtqw6"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.367171 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-kn7mm"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.413600 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") pod \"804d1679-c6e1-4594-b067-c41da8ee64ab\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") "
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.413670 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") pod \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\" (UID: \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\") "
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.413737 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") pod \"804d1679-c6e1-4594-b067-c41da8ee64ab\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") "
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.413779 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") pod \"804d1679-c6e1-4594-b067-c41da8ee64ab\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") "
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.414365 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities" (OuterVolumeSpecName: "utilities") pod "804d1679-c6e1-4594-b067-c41da8ee64ab" (UID: "804d1679-c6e1-4594-b067-c41da8ee64ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.418046 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr" (OuterVolumeSpecName: "kube-api-access-7f4xr") pod "c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" (UID: "c07643a3-c0a9-4770-a08e-ab4fb32dfe8e"). InnerVolumeSpecName "kube-api-access-7f4xr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.418099 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp" (OuterVolumeSpecName: "kube-api-access-v9hhp") pod "804d1679-c6e1-4594-b067-c41da8ee64ab" (UID: "804d1679-c6e1-4594-b067-c41da8ee64ab"). InnerVolumeSpecName "kube-api-access-v9hhp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.458440 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "804d1679-c6e1-4594-b067-c41da8ee64ab" (UID: "804d1679-c6e1-4594-b067-c41da8ee64ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.515683 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.515718 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") on node \"crc\" DevicePath \"\""
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.515732 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") on node \"crc\" DevicePath \"\""
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.515741 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.625599 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vhnvt"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.660728 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vhnvt"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.687989 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688434 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" containerName="oc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688460 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" containerName="oc"
Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688515 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="extract-utilities"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688526 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="extract-utilities"
Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688543 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="extract-content"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688551 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="extract-content"
Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688560 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688570 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server"
Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688585 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="extract-content"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688596 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="extract-content"
Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688611 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="extract-utilities"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688618 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="extract-utilities"
Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688628 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a11b02-0067-46ed-84fe-c764e88b2810" containerName="pruner"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688634 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a11b02-0067-46ed-84fe-c764e88b2810" containerName="pruner"
Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688647 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688654 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688818 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688839 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a11b02-0067-46ed-84fe-c764e88b2810" containerName="pruner"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688847 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688858 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" containerName="oc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.689664 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.699777 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.703345 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.704145 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.719281 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.719388 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.719455 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.820916 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.820998 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.821044 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.821219 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.821036 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.836803 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.012133 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.025098 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j5rwl"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.063768 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j5rwl"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.110428 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" path="/var/lib/kubelet/pods/f98b0611-9639-4987-9e9b-0e1c4695a164/volumes"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.122924 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" event={"ID":"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e","Type":"ContainerDied","Data":"f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a"}
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.122936 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-kn7mm"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.122949 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.128162 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtqw6"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.129549 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerDied","Data":"7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc"}
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.129792 4883 scope.go:117] "RemoveContainer" containerID="a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.145212 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"]
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.148668 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"]
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.150258 4883 scope.go:117] "RemoveContainer" containerID="4b273f81ff4709aae90e59a96f5a8a9a4b8f566fb0b34b2f69e7956ff48dc97f"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.172822 4883 scope.go:117] "RemoveContainer" containerID="fbcddb1697b17646b99e5eb8976196f07f60605511e41f78c1cc131392bba69c"
Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.383181 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 10 09:08:05 crc kubenswrapper[4883]: I0310 09:08:05.136409 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"176e0284-eb7e-40ac-8466-c3fab8836176","Type":"ContainerStarted","Data":"3787b6ad2bf41b839ef8b984df31432624a52240182015408ed618baf0c018de"}
Mar 10 09:08:05 crc kubenswrapper[4883]: I0310 09:08:05.136778 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"176e0284-eb7e-40ac-8466-c3fab8836176","Type":"ContainerStarted","Data":"552a1b11d60a599de3abec512c8277695d95d44418766139907a9fdf62fd5f01"}
Mar 10 09:08:05 crc kubenswrapper[4883]: I0310 09:08:05.151528 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.151507874 podStartE2EDuration="2.151507874s" podCreationTimestamp="2026-03-10 09:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:05.14844562 +0000 UTC m=+271.403343509" watchObservedRunningTime="2026-03-10 09:08:05.151507874 +0000 UTC m=+271.406405763"
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.088771 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" path="/var/lib/kubelet/pods/804d1679-c6e1-4594-b067-c41da8ee64ab/volumes"
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.349759 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"]
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.350104 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j5rwl" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" containerID="cri-o://a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942" gracePeriod=2
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.716222 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j5rwl"
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.758505 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") pod \"504544f8-69a4-4562-87d5-fa61335ea052\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") "
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.758644 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") pod \"504544f8-69a4-4562-87d5-fa61335ea052\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") "
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.758870 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") pod \"504544f8-69a4-4562-87d5-fa61335ea052\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") "
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.759258 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities" (OuterVolumeSpecName: "utilities") pod "504544f8-69a4-4562-87d5-fa61335ea052" (UID: "504544f8-69a4-4562-87d5-fa61335ea052"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.759426 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.764014 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75" (OuterVolumeSpecName: "kube-api-access-zfh75") pod "504544f8-69a4-4562-87d5-fa61335ea052" (UID: "504544f8-69a4-4562-87d5-fa61335ea052"). InnerVolumeSpecName "kube-api-access-zfh75". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.851129 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "504544f8-69a4-4562-87d5-fa61335ea052" (UID: "504544f8-69a4-4562-87d5-fa61335ea052"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.861298 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") on node \"crc\" DevicePath \"\""
Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.861534 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151655 4883 generic.go:334] "Generic (PLEG): container finished" podID="504544f8-69a4-4562-87d5-fa61335ea052" containerID="a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942" exitCode=0
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151705 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerDied","Data":"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942"}
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151740 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerDied","Data":"a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d"}
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151740 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j5rwl"
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151760 4883 scope.go:117] "RemoveContainer" containerID="a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942"
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.167152 4883 scope.go:117] "RemoveContainer" containerID="f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f"
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.174124 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"]
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.184757 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"]
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.194849 4883 scope.go:117] "RemoveContainer" containerID="d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11"
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.208170 4883 scope.go:117] "RemoveContainer" containerID="a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942"
Mar 10 09:08:07 crc kubenswrapper[4883]: E0310 09:08:07.208615 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942\": container with ID starting with a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942 not found: ID does not exist" containerID="a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942"
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.208667 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942"} err="failed to get container status \"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942\": rpc error: code = NotFound desc = could not find container \"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942\": container with ID starting with a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942 not found: ID does not exist"
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.208689 4883 scope.go:117] "RemoveContainer" containerID="f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f"
Mar 10 09:08:07 crc kubenswrapper[4883]: E0310 09:08:07.209124 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f\": container with ID starting with f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f not found: ID does not exist" containerID="f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f"
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.209163 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f"} err="failed to get container status \"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f\": rpc error: code = NotFound desc = could not find container \"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f\": container with ID starting with f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f not found: ID does not exist"
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.209177 4883 scope.go:117] "RemoveContainer" containerID="d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11"
Mar 10 09:08:07 crc kubenswrapper[4883]: E0310 09:08:07.209445 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11\": container with ID starting with d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11 not found: ID does not exist" containerID="d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11"
Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.209508 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11"} err="failed to get container status \"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11\": rpc error: code = NotFound desc = could not find container \"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11\": container with ID starting with d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11 not found: ID does not exist"
Mar 10 09:08:08 crc kubenswrapper[4883]: I0310 09:08:08.086075 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="504544f8-69a4-4562-87d5-fa61335ea052" path="/var/lib/kubelet/pods/504544f8-69a4-4562-87d5-fa61335ea052/volumes"
Mar 10 09:08:12 crc kubenswrapper[4883]: I0310 09:08:12.688278 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"]
Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.449524 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.450046 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.450116 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.450979 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.451053 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8" gracePeriod=600
Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.210251 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8" exitCode=0
Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.210330 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8"}
Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.211025 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a"}
Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.477054 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"]
Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.477303 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" podUID="78f401ec-703b-4789-8453-89b7a572a89a" containerName="controller-manager" containerID="cri-o://2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" gracePeriod=30
Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.491314 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"]
Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.491576 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerName="route-controller-manager" containerID="cri-o://691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" gracePeriod=30
Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.991978 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"
Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.997196 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl"
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013225 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") "
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013315 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") "
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013349 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s545\" (UniqueName: \"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") pod \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") "
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013386 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") "
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013406 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") pod \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") "
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013445 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") pod \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") "
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013540 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") "
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013576 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") pod \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") "
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013621 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") "
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014157 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014239 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" (UID: "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014285 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca" (OuterVolumeSpecName: "client-ca") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014350 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config" (OuterVolumeSpecName: "config") pod "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" (UID: "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014377 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config" (OuterVolumeSpecName: "config") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.019823 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.020287 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" (UID: "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.021300 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545" (OuterVolumeSpecName: "kube-api-access-5s545") pod "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" (UID: "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc"). InnerVolumeSpecName "kube-api-access-5s545". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.021491 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw" (OuterVolumeSpecName: "kube-api-access-5gcgw") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "kube-api-access-5gcgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115807 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115856 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115870 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s545\" (UniqueName: \"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115880 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115896 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115905 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115914 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115922 4883 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115932 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216318 4883 generic.go:334] "Generic (PLEG): container finished" podID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerID="691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" exitCode=0 Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216394 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" event={"ID":"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc","Type":"ContainerDied","Data":"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb"} Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216448 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" event={"ID":"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc","Type":"ContainerDied","Data":"199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd"} Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216466 4883 scope.go:117] "RemoveContainer" containerID="691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216588 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.218608 4883 generic.go:334] "Generic (PLEG): container finished" podID="78f401ec-703b-4789-8453-89b7a572a89a" containerID="2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" exitCode=0 Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.218655 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" event={"ID":"78f401ec-703b-4789-8453-89b7a572a89a","Type":"ContainerDied","Data":"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5"} Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.218687 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" event={"ID":"78f401ec-703b-4789-8453-89b7a572a89a","Type":"ContainerDied","Data":"f055423efedb6df834dff4c56e0e74a59306fd98bac74cffa507280ee1ff3f83"} Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.218743 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.230143 4883 scope.go:117] "RemoveContainer" containerID="691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.230705 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb\": container with ID starting with 691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb not found: ID does not exist" containerID="691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.230742 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb"} err="failed to get container status \"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb\": rpc error: code = NotFound desc = could not find container \"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb\": container with ID starting with 691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb not found: ID does not exist" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.230765 4883 scope.go:117] "RemoveContainer" containerID="2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.240347 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.241878 4883 scope.go:117] "RemoveContainer" containerID="2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.242662 4883 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5\": container with ID starting with 2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5 not found: ID does not exist" containerID="2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.242707 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5"} err="failed to get container status \"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5\": rpc error: code = NotFound desc = could not find container \"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5\": container with ID starting with 2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5 not found: ID does not exist" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.244521 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.249694 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.252987 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.975529 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7d57cc-k595r"] Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.975990 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="extract-content" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976010 4883 
state_mem.go:107] "Deleted CPUSet assignment" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="extract-content" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.976038 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976044 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.976054 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="extract-utilities" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976062 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="extract-utilities" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.976076 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerName="route-controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976082 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerName="route-controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.976090 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f401ec-703b-4789-8453-89b7a572a89a" containerName="controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976098 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f401ec-703b-4789-8453-89b7a572a89a" containerName="controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976304 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f401ec-703b-4789-8453-89b7a572a89a" containerName="controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 
09:08:19.976315 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976341 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerName="route-controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.977097 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.979100 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.980239 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.980276 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.980921 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.981062 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.981166 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.981315 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.981595 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.983625 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.983787 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.984230 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.984392 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.985058 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 
09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.985302 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.989884 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.991289 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7d57cc-k595r"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.996883 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv"] Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.086983 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" path="/var/lib/kubelet/pods/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc/volumes" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.087753 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f401ec-703b-4789-8453-89b7a572a89a" path="/var/lib/kubelet/pods/78f401ec-703b-4789-8453-89b7a572a89a/volumes" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131599 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-config\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131641 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-serving-cert\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: 
\"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131678 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsz8r\" (UniqueName: \"kubernetes.io/projected/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-kube-api-access-hsz8r\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131701 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbdf4\" (UniqueName: \"kubernetes.io/projected/16dd6ea4-f278-458c-b9e2-93085190d1b3-kube-api-access-hbdf4\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131732 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dd6ea4-f278-458c-b9e2-93085190d1b3-serving-cert\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131748 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-client-ca\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131770 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-client-ca\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131794 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-proxy-ca-bundles\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131853 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-config\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233189 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-config\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233320 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-config\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: 
\"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233367 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-serving-cert\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsz8r\" (UniqueName: \"kubernetes.io/projected/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-kube-api-access-hsz8r\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233428 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbdf4\" (UniqueName: \"kubernetes.io/projected/16dd6ea4-f278-458c-b9e2-93085190d1b3-kube-api-access-hbdf4\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233465 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dd6ea4-f278-458c-b9e2-93085190d1b3-serving-cert\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233511 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-client-ca\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233537 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-client-ca\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-proxy-ca-bundles\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.234682 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-config\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.234790 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-client-ca\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 
09:08:20.234995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-client-ca\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.235250 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-proxy-ca-bundles\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.235422 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-config\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.240813 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dd6ea4-f278-458c-b9e2-93085190d1b3-serving-cert\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.242604 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-serving-cert\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " 
pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.248350 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbdf4\" (UniqueName: \"kubernetes.io/projected/16dd6ea4-f278-458c-b9e2-93085190d1b3-kube-api-access-hbdf4\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.249877 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsz8r\" (UniqueName: \"kubernetes.io/projected/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-kube-api-access-hsz8r\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.302402 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.310829 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.688312 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv"] Mar 10 09:08:20 crc kubenswrapper[4883]: W0310 09:08:20.696096 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16dd6ea4_f278_458c_b9e2_93085190d1b3.slice/crio-8dedfd1b93d833fac5844ad449c87c891ec39ca513f553f0b4d046e63e09495c WatchSource:0}: Error finding container 8dedfd1b93d833fac5844ad449c87c891ec39ca513f553f0b4d046e63e09495c: Status 404 returned error can't find the container with id 8dedfd1b93d833fac5844ad449c87c891ec39ca513f553f0b4d046e63e09495c Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.736191 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7d57cc-k595r"] Mar 10 09:08:20 crc kubenswrapper[4883]: W0310 09:08:20.738361 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8dbef90_2f02_4ca8_9c38_cbb026c82e5b.slice/crio-a18a4be07e0b7004fbad321acf29391a6a83963183f58c90b51d99a5d576973c WatchSource:0}: Error finding container a18a4be07e0b7004fbad321acf29391a6a83963183f58c90b51d99a5d576973c: Status 404 returned error can't find the container with id a18a4be07e0b7004fbad321acf29391a6a83963183f58c90b51d99a5d576973c Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.234716 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" event={"ID":"16dd6ea4-f278-458c-b9e2-93085190d1b3","Type":"ContainerStarted","Data":"5c961f531b0dc68502c29b99ffd643a7d1ee02388ec051eb965974afd188944f"} Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.234765 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" event={"ID":"16dd6ea4-f278-458c-b9e2-93085190d1b3","Type":"ContainerStarted","Data":"8dedfd1b93d833fac5844ad449c87c891ec39ca513f553f0b4d046e63e09495c"} Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.234935 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.236822 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" event={"ID":"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b","Type":"ContainerStarted","Data":"1533e92453d6691b6336f7c80535c9c0fc5afda3dc6cb698c4a2e15dd630f784"} Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.236854 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" event={"ID":"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b","Type":"ContainerStarted","Data":"a18a4be07e0b7004fbad321acf29391a6a83963183f58c90b51d99a5d576973c"} Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.237026 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.240263 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.251189 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" podStartSLOduration=3.251177794 podStartE2EDuration="3.251177794s" podCreationTimestamp="2026-03-10 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:21.248774298 +0000 UTC m=+287.503672188" watchObservedRunningTime="2026-03-10 09:08:21.251177794 +0000 UTC m=+287.506075682" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.251645 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.266552 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" podStartSLOduration=3.266539435 podStartE2EDuration="3.266539435s" podCreationTimestamp="2026-03-10 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:21.265012816 +0000 UTC m=+287.519910706" watchObservedRunningTime="2026-03-10 09:08:21.266539435 +0000 UTC m=+287.521437325" Mar 10 09:08:37 crc kubenswrapper[4883]: I0310 09:08:37.710055 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerName="oauth-openshift" containerID="cri-o://9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" gracePeriod=15 Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.129238 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.156669 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66f68474cb-cmj6s"] Mar 10 09:08:38 crc kubenswrapper[4883]: E0310 09:08:38.156886 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerName="oauth-openshift" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.156906 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerName="oauth-openshift" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.156995 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerName="oauth-openshift" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.157353 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.167843 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f68474cb-cmj6s"] Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240300 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240349 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" 
(UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240386 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240441 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241154 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240522 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241199 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241227 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241253 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241272 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241289 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 
crc kubenswrapper[4883]: I0310 09:08:38.241305 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241330 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241376 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241395 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241505 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " 
pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241542 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-dir\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241599 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-login\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241630 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241680 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241702 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-session\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241788 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241808 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241879 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-policies\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241909 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241939 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-error\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241958 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdtvx\" (UniqueName: \"kubernetes.io/projected/27d82da9-1a67-4bb0-9a5b-21e2642140bf-kube-api-access-kdtvx\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " 
pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241963 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241996 4883 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.242073 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.242115 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.242151 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.246461 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.246910 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.246954 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b" (OuterVolumeSpecName: "kube-api-access-9lm8b") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "kube-api-access-9lm8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247206 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247549 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247639 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247817 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247955 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.248050 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334180 4883 generic.go:334] "Generic (PLEG): container finished" podID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerID="9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" exitCode=0 Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334263 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" event={"ID":"90f9a6f1-0760-4398-80cf-70c615c7032d","Type":"ContainerDied","Data":"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03"} Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334279 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334320 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" event={"ID":"90f9a6f1-0760-4398-80cf-70c615c7032d","Type":"ContainerDied","Data":"4fcf975f26107b7cfd1ff1be2d34f1e281e19924c7820362af5907d5ba2ac3dc"} Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334349 4883 scope.go:117] "RemoveContainer" containerID="9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.342711 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.342869 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.342968 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: 
I0310 09:08:38.343045 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343118 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-policies\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343200 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343274 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-error\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343348 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdtvx\" (UniqueName: \"kubernetes.io/projected/27d82da9-1a67-4bb0-9a5b-21e2642140bf-kube-api-access-kdtvx\") pod 
\"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343432 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343535 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-dir\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343647 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-login\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343808 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343894 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-session\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344008 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344077 4883 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344133 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344178 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-policies\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: 
\"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343735 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343744 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-dir\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344196 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344394 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344454 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344554 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344614 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344693 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344756 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344801 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344812 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344915 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344943 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346366 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346752 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-error\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346884 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346903 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346895 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.347309 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.347968 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.348119 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-login\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " 
pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.351058 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-session\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.356375 4883 scope.go:117] "RemoveContainer" containerID="9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" Mar 10 09:08:38 crc kubenswrapper[4883]: E0310 09:08:38.356710 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03\": container with ID starting with 9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03 not found: ID does not exist" containerID="9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.356743 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03"} err="failed to get container status \"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03\": rpc error: code = NotFound desc = could not find container \"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03\": container with ID starting with 9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03 not found: ID does not exist" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.360450 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdtvx\" (UniqueName: \"kubernetes.io/projected/27d82da9-1a67-4bb0-9a5b-21e2642140bf-kube-api-access-kdtvx\") pod 
\"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.364889 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"] Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.368402 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"] Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.472357 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.858074 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f68474cb-cmj6s"] Mar 10 09:08:38 crc kubenswrapper[4883]: W0310 09:08:38.866086 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d82da9_1a67_4bb0_9a5b_21e2642140bf.slice/crio-b5e5fe23b1e46f4d08b8b6205a028f87b98cb2dc3adfc1858280ee29a164cef4 WatchSource:0}: Error finding container b5e5fe23b1e46f4d08b8b6205a028f87b98cb2dc3adfc1858280ee29a164cef4: Status 404 returned error can't find the container with id b5e5fe23b1e46f4d08b8b6205a028f87b98cb2dc3adfc1858280ee29a164cef4 Mar 10 09:08:39 crc kubenswrapper[4883]: I0310 09:08:39.345086 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" event={"ID":"27d82da9-1a67-4bb0-9a5b-21e2642140bf","Type":"ContainerStarted","Data":"d78b6114971fd5b7b5f2587d1a0058b5af2181eb8c8460aacfe68b636d07ce7a"} Mar 10 09:08:39 crc kubenswrapper[4883]: I0310 09:08:39.345418 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:39 crc 
kubenswrapper[4883]: I0310 09:08:39.345433 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" event={"ID":"27d82da9-1a67-4bb0-9a5b-21e2642140bf","Type":"ContainerStarted","Data":"b5e5fe23b1e46f4d08b8b6205a028f87b98cb2dc3adfc1858280ee29a164cef4"} Mar 10 09:08:39 crc kubenswrapper[4883]: I0310 09:08:39.350011 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:39 crc kubenswrapper[4883]: I0310 09:08:39.376354 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" podStartSLOduration=27.376325152 podStartE2EDuration="27.376325152s" podCreationTimestamp="2026-03-10 09:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:39.374036895 +0000 UTC m=+305.628934784" watchObservedRunningTime="2026-03-10 09:08:39.376325152 +0000 UTC m=+305.631223042" Mar 10 09:08:40 crc kubenswrapper[4883]: I0310 09:08:40.086658 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" path="/var/lib/kubelet/pods/90f9a6f1-0760-4398-80cf-70c615c7032d/volumes" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.377488 4883 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378292 4883 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378427 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378625 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" gracePeriod=15 Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378685 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" gracePeriod=15 Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378730 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" gracePeriod=15 Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378688 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" gracePeriod=15 Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378748 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" gracePeriod=15 Mar 10 09:08:42 crc 
kubenswrapper[4883]: I0310 09:08:42.379847 4883 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380279 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380323 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380334 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380341 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380750 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380771 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380783 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380789 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380797 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 
09:08:42.380803 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380819 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380825 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380832 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380838 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380846 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380852 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381032 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381043 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381053 4883 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381059 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381066 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381074 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381083 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381091 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.381191 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381199 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.381208 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381214 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 
crc kubenswrapper[4883]: I0310 09:08:42.381347 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397034 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397074 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397100 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397127 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397196 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397324 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397398 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397464 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504301 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504445 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504565 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504660 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504508 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504626 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504384 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504791 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504918 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504990 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505056 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505092 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505284 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505381 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505399 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.245316 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 
09:08:43.246119 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.246795 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.247100 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.247455 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.247510 4883 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.247734 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="200ms" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.369466 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.370922 4883 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371727 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" exitCode=0 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371754 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" exitCode=0 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371763 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" exitCode=0 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371771 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" exitCode=2 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371843 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.373132 4883 generic.go:334] "Generic (PLEG): container finished" podID="176e0284-eb7e-40ac-8466-c3fab8836176" containerID="3787b6ad2bf41b839ef8b984df31432624a52240182015408ed618baf0c018de" exitCode=0 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.373300 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"176e0284-eb7e-40ac-8466-c3fab8836176","Type":"ContainerDied","Data":"3787b6ad2bf41b839ef8b984df31432624a52240182015408ed618baf0c018de"} Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.374011 4883 status_manager.go:851] 
"Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.374322 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.448405 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="400ms" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.849263 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="800ms" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.082349 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.082665 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.383108 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:08:44 crc kubenswrapper[4883]: E0310 09:08:44.650341 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="1.6s" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.710229 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.710898 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.711342 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.711530 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.711530 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.711938 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.712113 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") pod \"176e0284-eb7e-40ac-8466-c3fab8836176\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727220 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") pod \"176e0284-eb7e-40ac-8466-c3fab8836176\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727236 4883 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock" (OuterVolumeSpecName: "var-lock") pod "176e0284-eb7e-40ac-8466-c3fab8836176" (UID: "176e0284-eb7e-40ac-8466-c3fab8836176"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727253 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727306 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "176e0284-eb7e-40ac-8466-c3fab8836176" (UID: "176e0284-eb7e-40ac-8466-c3fab8836176"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727306 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727346 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727360 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727383 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727380 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") pod \"176e0284-eb7e-40ac-8466-c3fab8836176\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727618 4883 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727637 4883 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727645 4883 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727654 4883 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727663 4883 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.732136 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "176e0284-eb7e-40ac-8466-c3fab8836176" 
(UID: "176e0284-eb7e-40ac-8466-c3fab8836176"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.828713 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.390785 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.390781 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"176e0284-eb7e-40ac-8466-c3fab8836176","Type":"ContainerDied","Data":"552a1b11d60a599de3abec512c8277695d95d44418766139907a9fdf62fd5f01"} Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.390903 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552a1b11d60a599de3abec512c8277695d95d44418766139907a9fdf62fd5f01" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.394162 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.394990 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" exitCode=0 Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.395058 4883 scope.go:117] "RemoveContainer" containerID="65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.395071 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.404000 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.404312 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.408736 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.409130 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.412299 4883 scope.go:117] "RemoveContainer" containerID="1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.425159 4883 scope.go:117] "RemoveContainer" containerID="27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" Mar 10 09:08:45 crc 
kubenswrapper[4883]: I0310 09:08:45.438169 4883 scope.go:117] "RemoveContainer" containerID="32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.454967 4883 scope.go:117] "RemoveContainer" containerID="6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.475111 4883 scope.go:117] "RemoveContainer" containerID="0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.494586 4883 scope.go:117] "RemoveContainer" containerID="65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.495030 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\": container with ID starting with 65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea not found: ID does not exist" containerID="65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.495130 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea"} err="failed to get container status \"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\": rpc error: code = NotFound desc = could not find container \"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\": container with ID starting with 65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.495223 4883 scope.go:117] "RemoveContainer" containerID="1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.495598 
4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\": container with ID starting with 1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0 not found: ID does not exist" containerID="1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.495649 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0"} err="failed to get container status \"1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\": rpc error: code = NotFound desc = could not find container \"1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\": container with ID starting with 1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0 not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.495687 4883 scope.go:117] "RemoveContainer" containerID="27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.496084 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\": container with ID starting with 27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef not found: ID does not exist" containerID="27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.496151 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef"} err="failed to get container status \"27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\": rpc error: code = 
NotFound desc = could not find container \"27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\": container with ID starting with 27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.496181 4883 scope.go:117] "RemoveContainer" containerID="32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.496439 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\": container with ID starting with 32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada not found: ID does not exist" containerID="32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.496552 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada"} err="failed to get container status \"32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\": rpc error: code = NotFound desc = could not find container \"32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\": container with ID starting with 32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.496647 4883 scope.go:117] "RemoveContainer" containerID="6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.496979 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\": container with ID starting with 
6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3 not found: ID does not exist" containerID="6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.497017 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3"} err="failed to get container status \"6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\": rpc error: code = NotFound desc = could not find container \"6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\": container with ID starting with 6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3 not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.497036 4883 scope.go:117] "RemoveContainer" containerID="0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.497315 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\": container with ID starting with 0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766 not found: ID does not exist" containerID="0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.497398 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766"} err="failed to get container status \"0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\": rpc error: code = NotFound desc = could not find container \"0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\": container with ID starting with 0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766 not found: ID does not 
exist" Mar 10 09:08:46 crc kubenswrapper[4883]: I0310 09:08:46.086781 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 09:08:46 crc kubenswrapper[4883]: E0310 09:08:46.251251 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="3.2s" Mar 10 09:08:47 crc kubenswrapper[4883]: E0310 09:08:47.405992 4883 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:47 crc kubenswrapper[4883]: I0310 09:08:47.406385 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:47 crc kubenswrapper[4883]: E0310 09:08:47.426437 4883 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.140:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b6fbb0a7e0002 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:08:47.42601933 +0000 UTC m=+313.680917219,LastTimestamp:2026-03-10 09:08:47.42601933 +0000 UTC m=+313.680917219,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:08:48 crc kubenswrapper[4883]: I0310 09:08:48.414656 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97"} Mar 10 09:08:48 crc kubenswrapper[4883]: I0310 09:08:48.415329 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ff954ebccd801344dbf231359e366102e3322c626868177138aa0408ffe61662"} Mar 10 09:08:48 crc 
kubenswrapper[4883]: I0310 09:08:48.415910 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:48 crc kubenswrapper[4883]: E0310 09:08:48.415935 4883 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:49 crc kubenswrapper[4883]: E0310 09:08:49.172642 4883 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" volumeName="registry-storage" Mar 10 09:08:49 crc kubenswrapper[4883]: E0310 09:08:49.452030 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="6.4s" Mar 10 09:08:50 crc kubenswrapper[4883]: E0310 09:08:50.286798 4883 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.140:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b6fbb0a7e0002 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:08:47.42601933 +0000 UTC m=+313.680917219,LastTimestamp:2026-03-10 09:08:47.42601933 +0000 UTC m=+313.680917219,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.079110 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.083029 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.083724 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.097091 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 
09:08:54.097121 4883 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:54 crc kubenswrapper[4883]: E0310 09:08:54.097518 4883 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.098156 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:54 crc kubenswrapper[4883]: W0310 09:08:54.118663 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-389531043a26ad96c805ceda0574d811641b4cb78750239adfc35de16612cc07 WatchSource:0}: Error finding container 389531043a26ad96c805ceda0574d811641b4cb78750239adfc35de16612cc07: Status 404 returned error can't find the container with id 389531043a26ad96c805ceda0574d811641b4cb78750239adfc35de16612cc07 Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447208 4883 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8029248d3ffac147bea5f21b2ed5a173034cb51098695d4ac91961b6196bb600" exitCode=0 Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447283 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8029248d3ffac147bea5f21b2ed5a173034cb51098695d4ac91961b6196bb600"} Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447324 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"389531043a26ad96c805ceda0574d811641b4cb78750239adfc35de16612cc07"} Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447646 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447670 4883 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:54 crc kubenswrapper[4883]: E0310 09:08:54.447989 4883 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.448182 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.455621 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1096c1d9a2e294bd0f61f15e072645e1504385c7def66cf789514a63b6c06f52"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.456503 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4adc5f80e7505865a41c60abde887ffad91bc1bbfd89c4ae7c63b44399d82f8d"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.456594 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6dadc55f0b768da84020856b89d9fb67477618930f565bc9642fbf5df6feef7c"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.456689 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0394f59209da83456f47f287d8b4f4a78fba4b3726b0e9b2ae219187bcb0f9e0"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.457212 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9b4abf49a6e034f53ba327df4a6bd7a7b41695dff1f9e643121ab43c2a7ef748"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.457327 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.456912 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.457469 4883 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:56 crc kubenswrapper[4883]: I0310 09:08:56.464527 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 09:08:56 crc kubenswrapper[4883]: I0310 09:08:56.464910 4883 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d" exitCode=1 Mar 10 09:08:56 
crc kubenswrapper[4883]: I0310 09:08:56.464966 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d"} Mar 10 09:08:56 crc kubenswrapper[4883]: I0310 09:08:56.465513 4883 scope.go:117] "RemoveContainer" containerID="bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.081666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.081811 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.083758 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.084191 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.093078 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.099551 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.183544 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.183617 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.183648 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:08:57 crc kubenswrapper[4883]: 
I0310 09:08:57.185375 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.185415 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.195791 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.197377 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.207674 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.209518 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.293491 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.300891 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.306188 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.316122 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.323219 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.336806 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.473497 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.473823 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5a1732939c8e095f503ee44134e4761109af876024b9ab37c7ad425d9188de80"} Mar 10 09:08:57 crc kubenswrapper[4883]: W0310 09:08:57.696036 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-93592c8394a193177fa3ac76ceadaba7a7de8fdca29bbf09ca46c3378064b233 
WatchSource:0}: Error finding container 93592c8394a193177fa3ac76ceadaba7a7de8fdca29bbf09ca46c3378064b233: Status 404 returned error can't find the container with id 93592c8394a193177fa3ac76ceadaba7a7de8fdca29bbf09ca46c3378064b233 Mar 10 09:08:57 crc kubenswrapper[4883]: W0310 09:08:57.760453 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-f171ec1b2ea41f28f78790c4aec7dd7ceb311d00891ba24174962b8ccedae6ef WatchSource:0}: Error finding container f171ec1b2ea41f28f78790c4aec7dd7ceb311d00891ba24174962b8ccedae6ef: Status 404 returned error can't find the container with id f171ec1b2ea41f28f78790c4aec7dd7ceb311d00891ba24174962b8ccedae6ef Mar 10 09:08:57 crc kubenswrapper[4883]: W0310 09:08:57.761326 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6597a3_f861_4126_933e_d6134c8bd4b5.slice/crio-88207ec306f38ab39b5886f82a94d8b243de610e5bdb6dfbbdd0ada1283e2c6a WatchSource:0}: Error finding container 88207ec306f38ab39b5886f82a94d8b243de610e5bdb6dfbbdd0ada1283e2c6a: Status 404 returned error can't find the container with id 88207ec306f38ab39b5886f82a94d8b243de610e5bdb6dfbbdd0ada1283e2c6a Mar 10 09:08:57 crc kubenswrapper[4883]: W0310 09:08:57.818557 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e8d6bf354911c182324ecc57e5227fb11ece20603ab857a5d5be1df95b79cc79 WatchSource:0}: Error finding container e8d6bf354911c182324ecc57e5227fb11ece20603ab857a5d5be1df95b79cc79: Status 404 returned error can't find the container with id e8d6bf354911c182324ecc57e5227fb11ece20603ab857a5d5be1df95b79cc79 Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.482447 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.482813 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f171ec1b2ea41f28f78790c4aec7dd7ceb311d00891ba24174962b8ccedae6ef"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.484264 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" event={"ID":"bd6597a3-f861-4126-933e-d6134c8bd4b5","Type":"ContainerStarted","Data":"fb5e6257382d0578ecf8afbbee721052318816ef87140f1c0a174043a46565cb"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.484291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" event={"ID":"bd6597a3-f861-4126-933e-d6134c8bd4b5","Type":"ContainerStarted","Data":"126f51dfe48fb824b24ab2e48acfd441c2e681f0f96f9335a45ea7e2a76d444d"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.484303 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" event={"ID":"bd6597a3-f861-4126-933e-d6134c8bd4b5","Type":"ContainerStarted","Data":"88207ec306f38ab39b5886f82a94d8b243de610e5bdb6dfbbdd0ada1283e2c6a"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.485749 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8886c140227ff327084d724ef890173f33c85a37c593f364844b5ab4ac1d19a5"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.485776 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"93592c8394a193177fa3ac76ceadaba7a7de8fdca29bbf09ca46c3378064b233"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.487904 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c472bcd9d9c0472f328d6c1f36187b89d0b8d20951fdcd1717c02bd75540191d"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.487932 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e8d6bf354911c182324ecc57e5227fb11ece20603ab857a5d5be1df95b79cc79"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.488232 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.098327 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.098375 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.103486 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.493646 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.493708 4883 generic.go:334] "Generic 
(PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b" exitCode=255 Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.493804 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b"} Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.494415 4883 scope.go:117] "RemoveContainer" containerID="9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b" Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.505209 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.506110 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.506170 4883 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a" exitCode=255 Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.506214 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a"} Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.506270 4883 scope.go:117] "RemoveContainer" containerID="9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b" Mar 10 09:09:00 crc 
kubenswrapper[4883]: I0310 09:09:00.507093 4883 scope.go:117] "RemoveContainer" containerID="6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a" Mar 10 09:09:00 crc kubenswrapper[4883]: E0310 09:09:00.507502 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.748619 4883 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:09:01 crc kubenswrapper[4883]: I0310 09:09:01.515033 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 09:09:01 crc kubenswrapper[4883]: I0310 09:09:01.515491 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:09:01 crc kubenswrapper[4883]: I0310 09:09:01.515513 4883 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:09:01 crc kubenswrapper[4883]: I0310 09:09:01.519060 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:09:02 crc kubenswrapper[4883]: I0310 09:09:02.520727 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:09:02 crc kubenswrapper[4883]: I0310 09:09:02.521334 4883 mirror_client.go:130] "Deleting a mirror 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:09:04 crc kubenswrapper[4883]: I0310 09:09:04.106309 4883 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ddcb348f-b8d8-4643-9dae-15e77c150ea7" Mar 10 09:09:05 crc kubenswrapper[4883]: I0310 09:09:05.188661 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:09:05 crc kubenswrapper[4883]: I0310 09:09:05.197839 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:09:05 crc kubenswrapper[4883]: I0310 09:09:05.537174 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:09:06 crc kubenswrapper[4883]: I0310 09:09:06.546766 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:09:08 crc kubenswrapper[4883]: I0310 09:09:08.398763 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:09:08 crc kubenswrapper[4883]: I0310 09:09:08.649328 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 09:09:09 crc kubenswrapper[4883]: I0310 09:09:09.370364 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 09:09:09 crc kubenswrapper[4883]: I0310 09:09:09.732573 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 09:09:09 
crc kubenswrapper[4883]: I0310 09:09:09.908221 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 09:09:11 crc kubenswrapper[4883]: I0310 09:09:11.631096 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 09:09:11 crc kubenswrapper[4883]: I0310 09:09:11.981692 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.157627 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.284591 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.367209 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.407388 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.471219 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.720775 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.812366 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.298899 4883 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.319921 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.328568 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.664791 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.943798 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.962027 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.086917 4883 scope.go:117] "RemoveContainer" containerID="6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.212669 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.285695 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.333149 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.584351 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 
10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.584687 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3"} Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.760807 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.791242 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.127235 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.180689 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.266714 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.302947 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.428709 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.465877 4883 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595020 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595566 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595618 4883 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3" exitCode=255
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595655 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3"}
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595718 4883 scope.go:117] "RemoveContainer" containerID="6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a"
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.596298 4883 scope.go:117] "RemoveContainer" containerID="b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3"
Mar 10 09:09:15 crc kubenswrapper[4883]: E0310 09:09:15.597264 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.608156 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.676843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.731231 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.843590 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.902694 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.936495 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.066897 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.107588 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.173674 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.254560 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.419261 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.420398 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.455425 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.470441 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.507896 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.512072 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.520981 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.541391 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.583158 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.603917 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.620697 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.666645 4883 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.936387 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.938738 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.171705 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.182118 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.252761 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.276877 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.336048 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.373060 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.395191 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.518091 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.535904 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.636559 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.756244 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.802504 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.860934 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.861939 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.896378 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.005628 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.160187 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.210252 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.257210 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.260345 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.277201 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.319067 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.365728 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.483357 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.608807 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.711229 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.735307 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.758235 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.789954 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.792612 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.812093 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.827856 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.863092 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.870824 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.003514 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.196261 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.363555 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.423410 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.483333 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.509327 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.520843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.552152 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.559633 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.648697 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.662006 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.667843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.670378 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.685815 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.691938 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.706942 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.736364 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.836608 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.020026 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.020584 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.021341 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.066025 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.120552 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.134253 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.140923 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.240532 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.247930 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.269336 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.390416 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.441953 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.450581 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.467467 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.476040 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.585980 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.600243 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.671236 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.717262 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.753340 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.908927 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.913466 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.993789 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.039605 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.063038 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.064624 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.180138 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.258919 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.282764 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.363409 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.553155 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.588801 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.602575 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.641525 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.649831 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.667194 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.667214 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.739535 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.752328 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.759663 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.770314 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.782883 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.921760 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.922374 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.040196 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.055525 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.077044 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.115523 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.119057 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.170102 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.211237 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.392215 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.472252 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.475140 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.507304 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.545543 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.674687 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.810816 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.850256 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.890909 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.895738 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.897101 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.012983 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.074234 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.257693 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.354018 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.607226 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.608898 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.681751 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.682636 4883 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.751073 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.820632 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.880096 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.895578 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.952524 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.099998 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.277629 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.363359 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.378048 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.538929 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.611783 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.777258 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.850277 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.909896 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.924085 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.965612 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.031033 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.036369 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.195731 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.196537 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.208683 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.252181 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.279288 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.378070 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.443204 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.535523 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.651690 4883 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.836966 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.988189 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.045892 4883 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.080878 4883 scope.go:117] "RemoveContainer" containerID="b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3"
Mar 10 09:09:26 crc kubenswrapper[4883]: E0310 09:09:26.081182 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.103415 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.116886 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.133273 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.171786 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.206990 4883 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.209946 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gmq5n" podStartSLOduration=324.209924245 podStartE2EDuration="5m24.209924245s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:09:00.812254309 +0000 UTC m=+327.067152198" watchObservedRunningTime="2026-03-10 09:09:26.209924245 +0000 UTC m=+352.464822133"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.213016 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.213082 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.213104 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gmq5n"]
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.284004 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=26.283982875 podStartE2EDuration="26.283982875s" podCreationTimestamp="2026-03-10 09:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:09:26.235994613 +0000 UTC m=+352.490892502" watchObservedRunningTime="2026-03-10 09:09:26.283982875 +0000 UTC m=+352.538880765"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.292115 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.405981 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.428302 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.455533 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.467963 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.703010 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.866689 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.872758 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.899305 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.928534 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.153218 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.252497 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.310951 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.322506 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.357737 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.510373 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.528086 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.548673 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.569023 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.695597 4883 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-console"/"console-config" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.802808 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.014002 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.018211 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.142899 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.262831 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.592515 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.882859 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.906162 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 09:09:29 crc kubenswrapper[4883]: I0310 09:09:29.134116 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 09:09:29 crc kubenswrapper[4883]: I0310 09:09:29.205626 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 09:09:29 crc kubenswrapper[4883]: I0310 09:09:29.627608 4883 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 09:09:29 crc kubenswrapper[4883]: I0310 09:09:29.779266 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 09:09:30 crc kubenswrapper[4883]: I0310 09:09:30.135637 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 09:09:30 crc kubenswrapper[4883]: I0310 09:09:30.626625 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 09:09:30 crc kubenswrapper[4883]: I0310 09:09:30.863249 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 09:09:31 crc kubenswrapper[4883]: I0310 09:09:31.144993 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 09:09:34 crc kubenswrapper[4883]: I0310 09:09:34.103130 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:09:41 crc kubenswrapper[4883]: I0310 09:09:41.080708 4883 scope.go:117] "RemoveContainer" containerID="b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3" Mar 10 09:09:41 crc kubenswrapper[4883]: I0310 09:09:41.773533 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 10 09:09:41 crc kubenswrapper[4883]: I0310 09:09:41.774049 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"29f3b75738c92fd6855829ecb4d34df14b21666f45767a69c52a52332ffbfc0a"} Mar 10 09:09:44 crc kubenswrapper[4883]: I0310 09:09:44.823388 4883 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 09:09:44 crc kubenswrapper[4883]: I0310 09:09:44.824093 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" gracePeriod=5 Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.378083 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.378432 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.438946 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439017 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439059 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439086 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439088 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439160 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439194 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439221 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439555 4883 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439570 4883 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439579 4883 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439586 4883 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.445992 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.540680 4883 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.825023 4883 generic.go:334] "Generic (PLEG): container finished" podID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" exitCode=0 Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.825126 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerDied","Data":"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907"} Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.825750 4883 scope.go:117] "RemoveContainer" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.827223 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.827383 4883 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" exitCode=137 Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.827438 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.827514 4883 scope.go:117] "RemoveContainer" containerID="2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.844710 4883 scope.go:117] "RemoveContainer" containerID="2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" Mar 10 09:09:50 crc kubenswrapper[4883]: E0310 09:09:50.845032 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97\": container with ID starting with 2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97 not found: ID does not exist" containerID="2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.845061 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97"} err="failed to get container status \"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97\": rpc error: code = NotFound desc = could not find container \"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97\": container with ID starting with 2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97 not found: ID does not exist" Mar 10 09:09:51 crc kubenswrapper[4883]: I0310 09:09:51.838806 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerStarted","Data":"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4"} Mar 10 09:09:51 crc kubenswrapper[4883]: I0310 09:09:51.839414 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:09:51 crc kubenswrapper[4883]: I0310 09:09:51.840422 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:09:52 crc kubenswrapper[4883]: I0310 09:09:52.086386 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.156347 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"] Mar 10 09:10:00 crc kubenswrapper[4883]: E0310 09:10:00.157235 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" containerName="installer" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.157254 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" containerName="installer" Mar 10 09:10:00 crc kubenswrapper[4883]: E0310 09:10:00.157276 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.157282 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.157435 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.157461 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" containerName="installer" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.158118 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.161351 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.161799 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.162035 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.163612 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"] Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.249352 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") pod \"auto-csr-approver-29552230-7n2zm\" (UID: \"9e54057e-1436-403c-bd92-66dbb888b129\") " pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.350916 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") pod \"auto-csr-approver-29552230-7n2zm\" (UID: \"9e54057e-1436-403c-bd92-66dbb888b129\") " pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.369766 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") pod \"auto-csr-approver-29552230-7n2zm\" (UID: \"9e54057e-1436-403c-bd92-66dbb888b129\") " 
pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.475403 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.905355 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"] Mar 10 09:10:01 crc kubenswrapper[4883]: I0310 09:10:01.896799 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" event={"ID":"9e54057e-1436-403c-bd92-66dbb888b129","Type":"ContainerStarted","Data":"767f188be1f37ce6cc0246fa99cfced7d4e1bfc135c1ee0ba6a4061b69e0bdc9"} Mar 10 09:10:02 crc kubenswrapper[4883]: I0310 09:10:02.904430 4883 generic.go:334] "Generic (PLEG): container finished" podID="9e54057e-1436-403c-bd92-66dbb888b129" containerID="7590615cba141d2df532a5f2b91dc13b678e9424c198e88d350b709d2d0d8639" exitCode=0 Mar 10 09:10:02 crc kubenswrapper[4883]: I0310 09:10:02.904490 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" event={"ID":"9e54057e-1436-403c-bd92-66dbb888b129","Type":"ContainerDied","Data":"7590615cba141d2df532a5f2b91dc13b678e9424c198e88d350b709d2d0d8639"} Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.129619 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.307308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") pod \"9e54057e-1436-403c-bd92-66dbb888b129\" (UID: \"9e54057e-1436-403c-bd92-66dbb888b129\") " Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.314091 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk" (OuterVolumeSpecName: "kube-api-access-2wqvk") pod "9e54057e-1436-403c-bd92-66dbb888b129" (UID: "9e54057e-1436-403c-bd92-66dbb888b129"). InnerVolumeSpecName "kube-api-access-2wqvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.409584 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") on node \"crc\" DevicePath \"\"" Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.920704 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" event={"ID":"9e54057e-1436-403c-bd92-66dbb888b129","Type":"ContainerDied","Data":"767f188be1f37ce6cc0246fa99cfced7d4e1bfc135c1ee0ba6a4061b69e0bdc9"} Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.920775 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="767f188be1f37ce6cc0246fa99cfced7d4e1bfc135c1ee0ba6a4061b69e0bdc9" Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.920803 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:17 crc kubenswrapper[4883]: I0310 09:10:17.449175 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:10:17 crc kubenswrapper[4883]: I0310 09:10:17.449567 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.300961 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x4jrh"] Mar 10 09:10:46 crc kubenswrapper[4883]: E0310 09:10:46.302158 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e54057e-1436-403c-bd92-66dbb888b129" containerName="oc" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.302181 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e54057e-1436-403c-bd92-66dbb888b129" containerName="oc" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.302343 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e54057e-1436-403c-bd92-66dbb888b129" containerName="oc" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.302994 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.316714 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x4jrh"] Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.499929 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw4j6\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-kube-api-access-hw4j6\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500028 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-bound-sa-token\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500089 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500117 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500162 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-trusted-ca\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500318 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-certificates\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500382 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500527 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-tls\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.520826 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602354 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-bound-sa-token\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602417 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602459 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-trusted-ca\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602504 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-certificates\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602525 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602565 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-tls\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602636 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw4j6\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-kube-api-access-hw4j6\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602945 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.603809 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-trusted-ca\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc 
kubenswrapper[4883]: I0310 09:10:46.603938 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-certificates\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.609707 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-tls\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.609904 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.618504 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-bound-sa-token\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.618699 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw4j6\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-kube-api-access-hw4j6\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.621528 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.792468 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x4jrh"] Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.148288 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" event={"ID":"57ad81e1-b9ef-405b-bb22-5d496f5d56c6","Type":"ContainerStarted","Data":"793cb221e2dd05df8446d3fb0ae59d14621bd59f896a59a8407e93e9eb044c21"} Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.148686 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" event={"ID":"57ad81e1-b9ef-405b-bb22-5d496f5d56c6","Type":"ContainerStarted","Data":"90a4684ca3ee0ed5f5d406ac2b6e0dc42a95002dd843bc4e536da9c9a5bca05f"} Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.148705 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.167679 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" podStartSLOduration=1.16765426 podStartE2EDuration="1.16765426s" podCreationTimestamp="2026-03-10 09:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:10:47.165858352 +0000 UTC m=+433.420756241" watchObservedRunningTime="2026-03-10 09:10:47.16765426 +0000 UTC m=+433.422552149" Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.448651 4883 patch_prober.go:28] interesting 
pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.448720 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:11:06 crc kubenswrapper[4883]: I0310 09:11:06.625625 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:11:06 crc kubenswrapper[4883]: I0310 09:11:06.665400 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.660581 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.663443 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ltgv7" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" containerID="cri-o://91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.667551 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.667953 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tlhr4" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" 
containerName="registry-server" containerID="cri-o://f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.675703 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.676045 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" containerID="cri-o://5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.696171 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.696509 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2h5dv" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="registry-server" containerID="cri-o://88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.699418 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.699703 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vhnvt" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" containerID="cri-o://c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.701922 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9d6jf"] Mar 10 09:11:15 crc 
kubenswrapper[4883]: I0310 09:11:15.702619 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.716637 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9d6jf"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.880794 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.880893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.880928 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlfj\" (UniqueName: \"kubernetes.io/projected/849aec1a-3ce6-4153-8e52-4bf0185e29e3-kube-api-access-6qlfj\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.981446 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.981544 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.981581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlfj\" (UniqueName: \"kubernetes.io/projected/849aec1a-3ce6-4153-8e52-4bf0185e29e3-kube-api-access-6qlfj\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.982872 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.986701 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc 
kubenswrapper[4883]: I0310 09:11:15.995631 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlfj\" (UniqueName: \"kubernetes.io/projected/849aec1a-3ce6-4153-8e52-4bf0185e29e3-kube-api-access-6qlfj\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.017214 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.131757 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.138688 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.144611 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.162254 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.168037 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287182 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djb46\" (UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") pod \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287256 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") pod \"740631be-94cf-4c75-a5a3-0dbd57e2e510\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287285 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") pod \"740631be-94cf-4c75-a5a3-0dbd57e2e510\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287323 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") pod \"816c3b00-c481-4c08-9691-0244d3c044e3\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287353 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") pod \"7746695d-3e1f-455d-9acc-dffdba42c0d5\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287398 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") pod \"7746695d-3e1f-455d-9acc-dffdba42c0d5\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287423 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") pod \"816c3b00-c481-4c08-9691-0244d3c044e3\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287446 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") pod \"7746695d-3e1f-455d-9acc-dffdba42c0d5\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287495 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") pod \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287514 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") pod \"816c3b00-c481-4c08-9691-0244d3c044e3\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287534 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") pod 
\"74014fb3-ee38-481a-a27f-f12ff7f2c29a\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287564 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") pod \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287594 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") pod \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287611 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") pod \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287630 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") pod \"740631be-94cf-4c75-a5a3-0dbd57e2e510\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.288110 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities" (OuterVolumeSpecName: "utilities") pod "7746695d-3e1f-455d-9acc-dffdba42c0d5" (UID: "7746695d-3e1f-455d-9acc-dffdba42c0d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.288562 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities" (OuterVolumeSpecName: "utilities") pod "740631be-94cf-4c75-a5a3-0dbd57e2e510" (UID: "740631be-94cf-4c75-a5a3-0dbd57e2e510"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.288903 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "74014fb3-ee38-481a-a27f-f12ff7f2c29a" (UID: "74014fb3-ee38-481a-a27f-f12ff7f2c29a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.289589 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities" (OuterVolumeSpecName: "utilities") pod "fa724d40-49c8-4d1d-a7e9-5af8f0603e19" (UID: "fa724d40-49c8-4d1d-a7e9-5af8f0603e19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.292879 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities" (OuterVolumeSpecName: "utilities") pod "816c3b00-c481-4c08-9691-0244d3c044e3" (UID: "816c3b00-c481-4c08-9691-0244d3c044e3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.294645 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4" (OuterVolumeSpecName: "kube-api-access-85hl4") pod "7746695d-3e1f-455d-9acc-dffdba42c0d5" (UID: "7746695d-3e1f-455d-9acc-dffdba42c0d5"). InnerVolumeSpecName "kube-api-access-85hl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.294921 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "74014fb3-ee38-481a-a27f-f12ff7f2c29a" (UID: "74014fb3-ee38-481a-a27f-f12ff7f2c29a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.295137 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n" (OuterVolumeSpecName: "kube-api-access-p5m4n") pod "816c3b00-c481-4c08-9691-0244d3c044e3" (UID: "816c3b00-c481-4c08-9691-0244d3c044e3"). InnerVolumeSpecName "kube-api-access-p5m4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.295871 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46" (OuterVolumeSpecName: "kube-api-access-djb46") pod "fa724d40-49c8-4d1d-a7e9-5af8f0603e19" (UID: "fa724d40-49c8-4d1d-a7e9-5af8f0603e19"). InnerVolumeSpecName "kube-api-access-djb46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.299692 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5" (OuterVolumeSpecName: "kube-api-access-xvxh5") pod "740631be-94cf-4c75-a5a3-0dbd57e2e510" (UID: "740631be-94cf-4c75-a5a3-0dbd57e2e510"). InnerVolumeSpecName "kube-api-access-xvxh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.300198 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp" (OuterVolumeSpecName: "kube-api-access-lldvp") pod "74014fb3-ee38-481a-a27f-f12ff7f2c29a" (UID: "74014fb3-ee38-481a-a27f-f12ff7f2c29a"). InnerVolumeSpecName "kube-api-access-lldvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.312466 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7746695d-3e1f-455d-9acc-dffdba42c0d5" (UID: "7746695d-3e1f-455d-9acc-dffdba42c0d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.342995 4883 generic.go:334] "Generic (PLEG): container finished" podID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerID="5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.343057 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerDied","Data":"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.343431 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerDied","Data":"ee1e42ffe97556105d0510c897a1238a2dd105fd96a60722e66b11e2fc0634b8"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.343610 4883 scope.go:117] "RemoveContainer" containerID="5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.343074 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.346100 4883 generic.go:334] "Generic (PLEG): container finished" podID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerID="88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.346146 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.346186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerDied","Data":"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.346225 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerDied","Data":"7cd4d72ef0244e1c6f3955303b46c7d75041bd13eacfaf569a15ddb645d99b32"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.350067 4883 generic.go:334] "Generic (PLEG): container finished" podID="816c3b00-c481-4c08-9691-0244d3c044e3" containerID="91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.350103 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerDied","Data":"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.350146 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerDied","Data":"f5c87addac3f89a4858d25eb6fa3c57863872b10777952494e3f153096638f60"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.350174 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352176 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "816c3b00-c481-4c08-9691-0244d3c044e3" (UID: "816c3b00-c481-4c08-9691-0244d3c044e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352897 4883 generic.go:334] "Generic (PLEG): container finished" podID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerID="f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352960 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerDied","Data":"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352979 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerDied","Data":"bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352977 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.357666 4883 scope.go:117] "RemoveContainer" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.358869 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa724d40-49c8-4d1d-a7e9-5af8f0603e19" (UID: "fa724d40-49c8-4d1d-a7e9-5af8f0603e19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.360024 4883 generic.go:334] "Generic (PLEG): container finished" podID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerID="c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.360073 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerDied","Data":"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.360094 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.360107 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerDied","Data":"7617eec4e807a31ae8dae401f57247ee0d7df593c7506b5c96f9dc3caf16e27a"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.374424 4883 scope.go:117] "RemoveContainer" containerID="5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.374964 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4\": container with ID starting with 5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4 not found: ID does not exist" containerID="5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.375009 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4"} err="failed to get container status \"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4\": rpc error: code = NotFound desc = could not find container \"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4\": container with ID starting with 5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.375041 4883 scope.go:117] "RemoveContainer" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.375389 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907\": container with ID starting with 44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907 not found: ID does not exist" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.375431 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907"} err="failed to get container status \"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907\": rpc error: code = NotFound desc = could not find container \"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907\": container with ID starting with 44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.375459 4883 scope.go:117] "RemoveContainer" containerID="88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.383976 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389094 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389513 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389545 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389559 4883 
reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389572 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389584 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389595 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389607 4883 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389618 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389628 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389639 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djb46\" 
(UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389649 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389660 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389671 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389682 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.390611 4883 scope.go:117] "RemoveContainer" containerID="055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.391690 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.394171 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.408773 4883 scope.go:117] "RemoveContainer" containerID="8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df" Mar 10 09:11:16 crc 
kubenswrapper[4883]: I0310 09:11:16.421982 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "740631be-94cf-4c75-a5a3-0dbd57e2e510" (UID: "740631be-94cf-4c75-a5a3-0dbd57e2e510"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.424981 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9d6jf"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.426402 4883 scope.go:117] "RemoveContainer" containerID="88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.427694 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125\": container with ID starting with 88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125 not found: ID does not exist" containerID="88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.427739 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125"} err="failed to get container status \"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125\": rpc error: code = NotFound desc = could not find container \"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125\": container with ID starting with 88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.427768 4883 scope.go:117] "RemoveContainer" 
containerID="055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.429053 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98\": container with ID starting with 055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98 not found: ID does not exist" containerID="055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.429389 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98"} err="failed to get container status \"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98\": rpc error: code = NotFound desc = could not find container \"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98\": container with ID starting with 055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.429789 4883 scope.go:117] "RemoveContainer" containerID="8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.431259 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df\": container with ID starting with 8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df not found: ID does not exist" containerID="8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.431300 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df"} err="failed to get container status \"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df\": rpc error: code = NotFound desc = could not find container \"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df\": container with ID starting with 8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.431339 4883 scope.go:117] "RemoveContainer" containerID="91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.466918 4883 scope.go:117] "RemoveContainer" containerID="a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.489443 4883 scope.go:117] "RemoveContainer" containerID="a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.490332 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.509445 4883 scope.go:117] "RemoveContainer" containerID="91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.509910 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777\": container with ID starting with 91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777 not found: ID does not exist" containerID="91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.509942 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777"} err="failed to get container status \"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777\": rpc error: code = NotFound desc = could not find container \"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777\": container with ID starting with 91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.509974 4883 scope.go:117] "RemoveContainer" containerID="a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.510253 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f\": container with ID starting with a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f not found: ID does not exist" containerID="a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.510290 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f"} err="failed to get container status \"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f\": rpc error: code = NotFound desc = could not find container \"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f\": container with ID starting with a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.510315 4883 scope.go:117] "RemoveContainer" containerID="a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 
09:11:16.510994 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656\": container with ID starting with a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656 not found: ID does not exist" containerID="a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.511021 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656"} err="failed to get container status \"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656\": rpc error: code = NotFound desc = could not find container \"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656\": container with ID starting with a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.511036 4883 scope.go:117] "RemoveContainer" containerID="f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.523526 4883 scope.go:117] "RemoveContainer" containerID="4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.538106 4883 scope.go:117] "RemoveContainer" containerID="0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.551016 4883 scope.go:117] "RemoveContainer" containerID="f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.551898 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce\": container 
with ID starting with f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce not found: ID does not exist" containerID="f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.552402 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce"} err="failed to get container status \"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce\": rpc error: code = NotFound desc = could not find container \"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce\": container with ID starting with f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.552434 4883 scope.go:117] "RemoveContainer" containerID="4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.553297 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6\": container with ID starting with 4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6 not found: ID does not exist" containerID="4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.553342 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6"} err="failed to get container status \"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6\": rpc error: code = NotFound desc = could not find container \"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6\": container with ID starting with 4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6 not 
found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.553371 4883 scope.go:117] "RemoveContainer" containerID="0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.553721 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722\": container with ID starting with 0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722 not found: ID does not exist" containerID="0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.553747 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722"} err="failed to get container status \"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722\": rpc error: code = NotFound desc = could not find container \"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722\": container with ID starting with 0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.553763 4883 scope.go:117] "RemoveContainer" containerID="c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.565555 4883 scope.go:117] "RemoveContainer" containerID="db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.580633 4883 scope.go:117] "RemoveContainer" containerID="50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.593646 4883 scope.go:117] "RemoveContainer" containerID="c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" Mar 10 09:11:16 crc 
kubenswrapper[4883]: E0310 09:11:16.594178 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40\": container with ID starting with c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40 not found: ID does not exist" containerID="c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.594216 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40"} err="failed to get container status \"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40\": rpc error: code = NotFound desc = could not find container \"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40\": container with ID starting with c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.594240 4883 scope.go:117] "RemoveContainer" containerID="db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.594703 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461\": container with ID starting with db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461 not found: ID does not exist" containerID="db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.594730 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461"} err="failed to get container status 
\"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461\": rpc error: code = NotFound desc = could not find container \"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461\": container with ID starting with db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.594746 4883 scope.go:117] "RemoveContainer" containerID="50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.595045 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7\": container with ID starting with 50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7 not found: ID does not exist" containerID="50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.595071 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7"} err="failed to get container status \"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7\": rpc error: code = NotFound desc = could not find container \"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7\": container with ID starting with 50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.682094 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.689991 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.697770 4883 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.702883 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.709373 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.712761 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"] Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.369129 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" event={"ID":"849aec1a-3ce6-4153-8e52-4bf0185e29e3","Type":"ContainerStarted","Data":"65cb93f8e74db626d2881bca925a73ba4c6522ff6ae37af7a0978435830c4335"} Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.369195 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" event={"ID":"849aec1a-3ce6-4153-8e52-4bf0185e29e3","Type":"ContainerStarted","Data":"0d2c937827935f30f304975b437d057b684610d1efd3234a72edd6fe96d7dbcc"} Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.369662 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.375462 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.387183 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" podStartSLOduration=2.387159968 podStartE2EDuration="2.387159968s" podCreationTimestamp="2026-03-10 09:11:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:11:17.383108318 +0000 UTC m=+463.638006206" watchObservedRunningTime="2026-03-10 09:11:17.387159968 +0000 UTC m=+463.642057857" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.449155 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.449407 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.449518 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.450198 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.450270 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" 
containerID="cri-o://dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a" gracePeriod=600 Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.534578 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99873383_15b6_42ee_a65f_7917294d2e02.slice/crio-dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99873383_15b6_42ee_a65f_7917294d2e02.slice/crio-conmon-dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874268 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6jt6"] Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874687 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874702 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874710 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874716 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874727 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874735 4883 
state_mem.go:107] "Deleted CPUSet assignment" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874743 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874749 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874759 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874765 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874773 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874779 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874787 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874792 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874801 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874807 4883 
state_mem.go:107] "Deleted CPUSet assignment" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874813 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874819 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874827 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874832 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874838 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874844 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874851 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874858 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874877 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 
09:11:17.874883 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874891 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874897 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874993 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875004 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875016 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875022 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875030 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875038 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875788 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.877319 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.884795 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6jt6"] Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.014311 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-utilities\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.014373 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjdw\" (UniqueName: \"kubernetes.io/projected/790ba2f9-1214-4040-a140-0663e2b869b1-kube-api-access-6bjdw\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.014465 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-catalog-content\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.074507 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g87df"] Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.076143 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.081811 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.089666 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" path="/var/lib/kubelet/pods/74014fb3-ee38-481a-a27f-f12ff7f2c29a/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.090532 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" path="/var/lib/kubelet/pods/740631be-94cf-4c75-a5a3-0dbd57e2e510/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.091142 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" path="/var/lib/kubelet/pods/7746695d-3e1f-455d-9acc-dffdba42c0d5/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.091727 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" path="/var/lib/kubelet/pods/816c3b00-c481-4c08-9691-0244d3c044e3/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.092293 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" path="/var/lib/kubelet/pods/fa724d40-49c8-4d1d-a7e9-5af8f0603e19/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.097649 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g87df"] Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.115961 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-utilities\") pod \"redhat-marketplace-v6jt6\" (UID: 
\"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.116025 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjdw\" (UniqueName: \"kubernetes.io/projected/790ba2f9-1214-4040-a140-0663e2b869b1-kube-api-access-6bjdw\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.116084 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-catalog-content\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.116467 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-utilities\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.116578 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-catalog-content\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.134180 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjdw\" (UniqueName: \"kubernetes.io/projected/790ba2f9-1214-4040-a140-0663e2b869b1-kube-api-access-6bjdw\") pod \"redhat-marketplace-v6jt6\" (UID: 
\"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.189041 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.217027 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s42bf\" (UniqueName: \"kubernetes.io/projected/06556553-1ab9-4217-ad98-679ff31feaf9-kube-api-access-s42bf\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.217078 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-catalog-content\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.217128 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-utilities\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.318805 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s42bf\" (UniqueName: \"kubernetes.io/projected/06556553-1ab9-4217-ad98-679ff31feaf9-kube-api-access-s42bf\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.318852 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-catalog-content\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.318906 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-utilities\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.319922 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-catalog-content\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.319938 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-utilities\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.333820 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s42bf\" (UniqueName: \"kubernetes.io/projected/06556553-1ab9-4217-ad98-679ff31feaf9-kube-api-access-s42bf\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.383813 4883 generic.go:334] "Generic (PLEG): container finished" 
podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a" exitCode=0 Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.383918 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a"} Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.383995 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13"} Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.384022 4883 scope.go:117] "RemoveContainer" containerID="3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.393565 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:18.543253 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6jt6"] Mar 10 09:11:19 crc kubenswrapper[4883]: W0310 09:11:18.555455 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790ba2f9_1214_4040_a140_0663e2b869b1.slice/crio-00155b5c512dae8d1115e236d8a73c737796c6264f06ea4714abd0a014db45bf WatchSource:0}: Error finding container 00155b5c512dae8d1115e236d8a73c737796c6264f06ea4714abd0a014db45bf: Status 404 returned error can't find the container with id 00155b5c512dae8d1115e236d8a73c737796c6264f06ea4714abd0a014db45bf Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:18.782374 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g87df"] Mar 10 09:11:19 crc kubenswrapper[4883]: W0310 09:11:18.788084 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06556553_1ab9_4217_ad98_679ff31feaf9.slice/crio-fdd3d35ca65e39ed1b3a4bdf4797e16f0e4bbfc4522d57e076f72cff3456ee94 WatchSource:0}: Error finding container fdd3d35ca65e39ed1b3a4bdf4797e16f0e4bbfc4522d57e076f72cff3456ee94: Status 404 returned error can't find the container with id fdd3d35ca65e39ed1b3a4bdf4797e16f0e4bbfc4522d57e076f72cff3456ee94 Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.390423 4883 generic.go:334] "Generic (PLEG): container finished" podID="790ba2f9-1214-4040-a140-0663e2b869b1" containerID="d1c1f129c5a12787ef11b631be04a777292a30f6fd8961f2da928f0c69a8fb18" exitCode=0 Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.390538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" 
event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerDied","Data":"d1c1f129c5a12787ef11b631be04a777292a30f6fd8961f2da928f0c69a8fb18"} Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.390799 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerStarted","Data":"00155b5c512dae8d1115e236d8a73c737796c6264f06ea4714abd0a014db45bf"} Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.392180 4883 generic.go:334] "Generic (PLEG): container finished" podID="06556553-1ab9-4217-ad98-679ff31feaf9" containerID="d7d62049e5d70020d0e4432a921c480056f1ddc660f983393b3160d09b750bfc" exitCode=0 Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.392242 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerDied","Data":"d7d62049e5d70020d0e4432a921c480056f1ddc660f983393b3160d09b750bfc"} Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.392273 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerStarted","Data":"fdd3d35ca65e39ed1b3a4bdf4797e16f0e4bbfc4522d57e076f72cff3456ee94"} Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.281165 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7kbr"] Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.282452 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.283678 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7kbr"] Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.284117 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.403739 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerStarted","Data":"b57e3b9dc79342cf020637a710b7a7110bc519e6f310e4e36097cf3a2ad58157"} Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.407144 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerStarted","Data":"063421c6ebb33f3707f60107cb478af0622a4ad32041380fff67ec33cab7b5fc"} Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.449748 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-catalog-content\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.449831 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-utilities\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.449916 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbw6\" (UniqueName: \"kubernetes.io/projected/f43173ae-a262-4efa-8141-419be6d01b7d-kube-api-access-ffbw6\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.469790 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gqg54"] Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.470769 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.474831 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.479720 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqg54"] Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.551500 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-catalog-content\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.551556 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-utilities\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.552029 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-catalog-content\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.552066 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-utilities\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.552104 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbw6\" (UniqueName: \"kubernetes.io/projected/f43173ae-a262-4efa-8141-419be6d01b7d-kube-api-access-ffbw6\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.568738 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbw6\" (UniqueName: \"kubernetes.io/projected/f43173ae-a262-4efa-8141-419be6d01b7d-kube-api-access-ffbw6\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.604529 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.652822 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-catalog-content\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.652890 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-utilities\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.652975 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffkw8\" (UniqueName: \"kubernetes.io/projected/8e7df241-6476-44a7-a800-921897b7e381-kube-api-access-ffkw8\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.754592 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-catalog-content\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.755127 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-catalog-content\") pod 
\"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.755585 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-utilities\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.755872 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-utilities\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.755977 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffkw8\" (UniqueName: \"kubernetes.io/projected/8e7df241-6476-44a7-a800-921897b7e381-kube-api-access-ffkw8\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.772425 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffkw8\" (UniqueName: \"kubernetes.io/projected/8e7df241-6476-44a7-a800-921897b7e381-kube-api-access-ffkw8\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.835710 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.954260 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7kbr"] Mar 10 09:11:20 crc kubenswrapper[4883]: W0310 09:11:20.960048 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf43173ae_a262_4efa_8141_419be6d01b7d.slice/crio-abecc8eee2fc4b23ee465ef889874fbc8a7ea8ef0ec908e1057df4645f2ebca9 WatchSource:0}: Error finding container abecc8eee2fc4b23ee465ef889874fbc8a7ea8ef0ec908e1057df4645f2ebca9: Status 404 returned error can't find the container with id abecc8eee2fc4b23ee465ef889874fbc8a7ea8ef0ec908e1057df4645f2ebca9 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.207775 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqg54"] Mar 10 09:11:21 crc kubenswrapper[4883]: W0310 09:11:21.212954 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7df241_6476_44a7_a800_921897b7e381.slice/crio-80ebfddbe5d4c043f26dda09827ea3b9f4b82570810beff570f305d8620df8da WatchSource:0}: Error finding container 80ebfddbe5d4c043f26dda09827ea3b9f4b82570810beff570f305d8620df8da: Status 404 returned error can't find the container with id 80ebfddbe5d4c043f26dda09827ea3b9f4b82570810beff570f305d8620df8da Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.415541 4883 generic.go:334] "Generic (PLEG): container finished" podID="790ba2f9-1214-4040-a140-0663e2b869b1" containerID="b57e3b9dc79342cf020637a710b7a7110bc519e6f310e4e36097cf3a2ad58157" exitCode=0 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.415652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" 
event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerDied","Data":"b57e3b9dc79342cf020637a710b7a7110bc519e6f310e4e36097cf3a2ad58157"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.417506 4883 generic.go:334] "Generic (PLEG): container finished" podID="06556553-1ab9-4217-ad98-679ff31feaf9" containerID="063421c6ebb33f3707f60107cb478af0622a4ad32041380fff67ec33cab7b5fc" exitCode=0 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.417575 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerDied","Data":"063421c6ebb33f3707f60107cb478af0622a4ad32041380fff67ec33cab7b5fc"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.420032 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e7df241-6476-44a7-a800-921897b7e381" containerID="8843f3fb95d798ba8a042916ea39f2877a0b087180e5dd8510d27c412884048c" exitCode=0 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.420130 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerDied","Data":"8843f3fb95d798ba8a042916ea39f2877a0b087180e5dd8510d27c412884048c"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.420168 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerStarted","Data":"80ebfddbe5d4c043f26dda09827ea3b9f4b82570810beff570f305d8620df8da"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.421549 4883 generic.go:334] "Generic (PLEG): container finished" podID="f43173ae-a262-4efa-8141-419be6d01b7d" containerID="11328ab9bfc39030bd2e5f157c6fd6fb571958651803563ee07642b9b4f4289d" exitCode=0 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.421583 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-p7kbr" event={"ID":"f43173ae-a262-4efa-8141-419be6d01b7d","Type":"ContainerDied","Data":"11328ab9bfc39030bd2e5f157c6fd6fb571958651803563ee07642b9b4f4289d"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.421615 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7kbr" event={"ID":"f43173ae-a262-4efa-8141-419be6d01b7d","Type":"ContainerStarted","Data":"abecc8eee2fc4b23ee465ef889874fbc8a7ea8ef0ec908e1057df4645f2ebca9"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.432999 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerStarted","Data":"ba8fc165cfc7a7c565522559d94192a1515f7552c9f0775863a01e383f590805"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.435572 4883 generic.go:334] "Generic (PLEG): container finished" podID="f43173ae-a262-4efa-8141-419be6d01b7d" containerID="8c9fa15df75782b50140862b92a1a092d40d1ff07e077a68309ff912f8925ab0" exitCode=0 Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.435628 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7kbr" event={"ID":"f43173ae-a262-4efa-8141-419be6d01b7d","Type":"ContainerDied","Data":"8c9fa15df75782b50140862b92a1a092d40d1ff07e077a68309ff912f8925ab0"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.439378 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerStarted","Data":"55de9c65aab5bd1c255a807d31442d5ee4e3aa60b789dce327cd95f8385bc6bc"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.442006 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" 
event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerStarted","Data":"292eae0399f0d439e5a73bfdd16a81dd3749c12520ff10d70fcc84b78ed738df"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.468552 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g87df" podStartSLOduration=1.748040482 podStartE2EDuration="4.46853128s" podCreationTimestamp="2026-03-10 09:11:18 +0000 UTC" firstStartedPulling="2026-03-10 09:11:19.393908612 +0000 UTC m=+465.648806501" lastFinishedPulling="2026-03-10 09:11:22.114399409 +0000 UTC m=+468.369297299" observedRunningTime="2026-03-10 09:11:22.464664128 +0000 UTC m=+468.719562017" watchObservedRunningTime="2026-03-10 09:11:22.46853128 +0000 UTC m=+468.723429169" Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.499550 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6jt6" podStartSLOduration=2.906578505 podStartE2EDuration="5.499533035s" podCreationTimestamp="2026-03-10 09:11:17 +0000 UTC" firstStartedPulling="2026-03-10 09:11:19.392748715 +0000 UTC m=+465.647646603" lastFinishedPulling="2026-03-10 09:11:21.985703243 +0000 UTC m=+468.240601133" observedRunningTime="2026-03-10 09:11:22.495392307 +0000 UTC m=+468.750290207" watchObservedRunningTime="2026-03-10 09:11:22.499533035 +0000 UTC m=+468.754430925" Mar 10 09:11:23 crc kubenswrapper[4883]: I0310 09:11:23.452380 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7kbr" event={"ID":"f43173ae-a262-4efa-8141-419be6d01b7d","Type":"ContainerStarted","Data":"fd5bfa6abee8295036504b162c76c120d8beadf3fd59e0557255941439737175"} Mar 10 09:11:23 crc kubenswrapper[4883]: I0310 09:11:23.456207 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e7df241-6476-44a7-a800-921897b7e381" containerID="ba8fc165cfc7a7c565522559d94192a1515f7552c9f0775863a01e383f590805" exitCode=0 Mar 10 
09:11:23 crc kubenswrapper[4883]: I0310 09:11:23.456257 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerDied","Data":"ba8fc165cfc7a7c565522559d94192a1515f7552c9f0775863a01e383f590805"} Mar 10 09:11:23 crc kubenswrapper[4883]: I0310 09:11:23.473273 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7kbr" podStartSLOduration=1.945969673 podStartE2EDuration="3.473252081s" podCreationTimestamp="2026-03-10 09:11:20 +0000 UTC" firstStartedPulling="2026-03-10 09:11:21.423098062 +0000 UTC m=+467.677995951" lastFinishedPulling="2026-03-10 09:11:22.95038047 +0000 UTC m=+469.205278359" observedRunningTime="2026-03-10 09:11:23.472379516 +0000 UTC m=+469.727277405" watchObservedRunningTime="2026-03-10 09:11:23.473252081 +0000 UTC m=+469.728149970" Mar 10 09:11:24 crc kubenswrapper[4883]: I0310 09:11:24.463708 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerStarted","Data":"5aa6d0e2366b6a6450d2e13b1ecaf223b1e41665d4ed3461495ce272366c692c"} Mar 10 09:11:24 crc kubenswrapper[4883]: I0310 09:11:24.485629 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gqg54" podStartSLOduration=1.93511128 podStartE2EDuration="4.48560823s" podCreationTimestamp="2026-03-10 09:11:20 +0000 UTC" firstStartedPulling="2026-03-10 09:11:21.421167382 +0000 UTC m=+467.676065271" lastFinishedPulling="2026-03-10 09:11:23.971664332 +0000 UTC m=+470.226562221" observedRunningTime="2026-03-10 09:11:24.481649415 +0000 UTC m=+470.736547305" watchObservedRunningTime="2026-03-10 09:11:24.48560823 +0000 UTC m=+470.740506119" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.189821 4883 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.190587 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.224776 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.394132 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.394209 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.427710 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.516169 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.516627 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.605059 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.605205 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.637569 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:30 crc 
kubenswrapper[4883]: I0310 09:11:30.835983 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.836054 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.869055 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:31 crc kubenswrapper[4883]: I0310 09:11:31.536853 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:31 crc kubenswrapper[4883]: I0310 09:11:31.540085 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:31 crc kubenswrapper[4883]: I0310 09:11:31.698270 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerName="registry" containerID="cri-o://7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" gracePeriod=30 Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.044314 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208009 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208092 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208242 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208280 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208333 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208390 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208420 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.210014 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.210108 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.217365 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh" (OuterVolumeSpecName: "kube-api-access-q9wjh") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "kube-api-access-q9wjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.217466 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.217805 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.217991 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.223870 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.231610 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311061 4883 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311110 4883 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311128 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311140 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311151 4883 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311160 4883 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311169 4883 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.512456 4883 generic.go:334] "Generic (PLEG): container finished" podID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerID="7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" exitCode=0 Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.512609 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.512609 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" event={"ID":"7bfdcb1d-e416-438c-9916-5c42cf35f2eb","Type":"ContainerDied","Data":"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431"} Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.513108 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" event={"ID":"7bfdcb1d-e416-438c-9916-5c42cf35f2eb","Type":"ContainerDied","Data":"ba9d597cdd4e690659606d934bb4d1fb3e310147327af93f1ac8149f438281d6"} Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.513142 4883 scope.go:117] "RemoveContainer" containerID="7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.536564 4883 scope.go:117] "RemoveContainer" containerID="7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" Mar 10 09:11:32 crc kubenswrapper[4883]: E0310 09:11:32.537186 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431\": container with ID starting with 7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431 not found: ID does not exist" containerID="7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.537223 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431"} err="failed to get container status \"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431\": rpc error: code = NotFound desc = could not find container 
\"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431\": container with ID starting with 7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431 not found: ID does not exist" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.558549 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"] Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.561534 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"] Mar 10 09:11:34 crc kubenswrapper[4883]: I0310 09:11:34.087615 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" path="/var/lib/kubelet/pods/7bfdcb1d-e416-438c-9916-5c42cf35f2eb/volumes" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.133750 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"] Mar 10 09:12:00 crc kubenswrapper[4883]: E0310 09:12:00.134538 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerName="registry" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.134554 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerName="registry" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.134661 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerName="registry" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.135084 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.137660 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.137799 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.137919 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.139773 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"] Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.226140 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") pod \"auto-csr-approver-29552232-d429x\" (UID: \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\") " pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.327790 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") pod \"auto-csr-approver-29552232-d429x\" (UID: \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\") " pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.346536 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") pod \"auto-csr-approver-29552232-d429x\" (UID: \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\") " 
pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.447930 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.802741 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"] Mar 10 09:12:01 crc kubenswrapper[4883]: I0310 09:12:01.672747 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552232-d429x" event={"ID":"bbf9a36e-c0e2-4943-a87c-9f6735b2714e","Type":"ContainerStarted","Data":"38121c79b8ef164512014cedd9718eef5e4d30811a6036a632888659585fca79"} Mar 10 09:12:02 crc kubenswrapper[4883]: I0310 09:12:02.680618 4883 generic.go:334] "Generic (PLEG): container finished" podID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" containerID="49c1aa583870be3098bda47d15de71e40f64a8b97a906132b01f7c81a5eefc00" exitCode=0 Mar 10 09:12:02 crc kubenswrapper[4883]: I0310 09:12:02.680708 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552232-d429x" event={"ID":"bbf9a36e-c0e2-4943-a87c-9f6735b2714e","Type":"ContainerDied","Data":"49c1aa583870be3098bda47d15de71e40f64a8b97a906132b01f7c81a5eefc00"} Mar 10 09:12:03 crc kubenswrapper[4883]: I0310 09:12:03.866546 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:03 crc kubenswrapper[4883]: I0310 09:12:03.973101 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") pod \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\" (UID: \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\") " Mar 10 09:12:03 crc kubenswrapper[4883]: I0310 09:12:03.978893 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9" (OuterVolumeSpecName: "kube-api-access-5vxd9") pod "bbf9a36e-c0e2-4943-a87c-9f6735b2714e" (UID: "bbf9a36e-c0e2-4943-a87c-9f6735b2714e"). InnerVolumeSpecName "kube-api-access-5vxd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.074682 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") on node \"crc\" DevicePath \"\"" Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.694856 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552232-d429x" event={"ID":"bbf9a36e-c0e2-4943-a87c-9f6735b2714e","Type":"ContainerDied","Data":"38121c79b8ef164512014cedd9718eef5e4d30811a6036a632888659585fca79"} Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.694932 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38121c79b8ef164512014cedd9718eef5e4d30811a6036a632888659585fca79" Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.694955 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.914976 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"] Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.919168 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"] Mar 10 09:12:06 crc kubenswrapper[4883]: I0310 09:12:06.085081 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632d4971-be4e-4939-a46a-42604b182436" path="/var/lib/kubelet/pods/632d4971-be4e-4939-a46a-42604b182436/volumes" Mar 10 09:13:17 crc kubenswrapper[4883]: I0310 09:13:17.449468 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:13:17 crc kubenswrapper[4883]: I0310 09:13:17.449909 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:13:34 crc kubenswrapper[4883]: I0310 09:13:34.294052 4883 scope.go:117] "RemoveContainer" containerID="4f14915d4656a1c4b614a36f6c062aeb515069a6fa76ae902ddf80e353fe48d5" Mar 10 09:13:47 crc kubenswrapper[4883]: I0310 09:13:47.448963 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:13:47 crc kubenswrapper[4883]: 
I0310 09:13:47.449297 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.124178 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"] Mar 10 09:14:00 crc kubenswrapper[4883]: E0310 09:14:00.124783 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" containerName="oc" Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.124796 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" containerName="oc" Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.124923 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" containerName="oc" Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.125300 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.126873 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.126919 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.127001 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.129044 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"]
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.263462 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") pod \"auto-csr-approver-29552234-ftnh5\" (UID: \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\") " pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.364587 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") pod \"auto-csr-approver-29552234-ftnh5\" (UID: \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\") " pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.379295 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") pod \"auto-csr-approver-29552234-ftnh5\" (UID: \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\") " pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.441402 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.771610 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"]
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.778115 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 09:14:01 crc kubenswrapper[4883]: I0310 09:14:01.277847 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552234-ftnh5" event={"ID":"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9","Type":"ContainerStarted","Data":"8528f2157f0acd32328646f1dce7be45a561faceb1ed76a0c35c69cc8d4bd560"}
Mar 10 09:14:02 crc kubenswrapper[4883]: I0310 09:14:02.288077 4883 generic.go:334] "Generic (PLEG): container finished" podID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" containerID="3e7da8f0c03e771b080917bc83392de1ddb5243f6ec147ddb91205eab0cfd88f" exitCode=0
Mar 10 09:14:02 crc kubenswrapper[4883]: I0310 09:14:02.288286 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552234-ftnh5" event={"ID":"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9","Type":"ContainerDied","Data":"3e7da8f0c03e771b080917bc83392de1ddb5243f6ec147ddb91205eab0cfd88f"}
Mar 10 09:14:03 crc kubenswrapper[4883]: I0310 09:14:03.444807 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:03 crc kubenswrapper[4883]: I0310 09:14:03.598718 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") pod \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\" (UID: \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\") "
Mar 10 09:14:03 crc kubenswrapper[4883]: I0310 09:14:03.603722 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm" (OuterVolumeSpecName: "kube-api-access-48gqm") pod "80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" (UID: "80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9"). InnerVolumeSpecName "kube-api-access-48gqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:14:03 crc kubenswrapper[4883]: I0310 09:14:03.699940 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") on node \"crc\" DevicePath \"\""
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.298651 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552234-ftnh5" event={"ID":"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9","Type":"ContainerDied","Data":"8528f2157f0acd32328646f1dce7be45a561faceb1ed76a0c35c69cc8d4bd560"}
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.298693 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8528f2157f0acd32328646f1dce7be45a561faceb1ed76a0c35c69cc8d4bd560"
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.298752 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.485365 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"]
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.490800 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"]
Mar 10 09:14:06 crc kubenswrapper[4883]: I0310 09:14:06.085272 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" path="/var/lib/kubelet/pods/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e/volumes"
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.449522 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.449729 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.449765 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.450156 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.450213 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13" gracePeriod=600
Mar 10 09:14:18 crc kubenswrapper[4883]: I0310 09:14:18.365270 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13" exitCode=0
Mar 10 09:14:18 crc kubenswrapper[4883]: I0310 09:14:18.365335 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13"}
Mar 10 09:14:18 crc kubenswrapper[4883]: I0310 09:14:18.365698 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c"}
Mar 10 09:14:18 crc kubenswrapper[4883]: I0310 09:14:18.365724 4883 scope.go:117] "RemoveContainer" containerID="dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a"
Mar 10 09:14:34 crc kubenswrapper[4883]: I0310 09:14:34.336200 4883 scope.go:117] "RemoveContainer" containerID="8d2862eee27c865a5680228f73b67899d38c264111c25020319e7ec39c7a9c80"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.129182 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 09:15:00 crc kubenswrapper[4883]: E0310 09:15:00.130205 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" containerName="oc"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.130228 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" containerName="oc"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.130344 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" containerName="oc"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.130868 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.134118 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.134249 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.140210 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.236302 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.236358 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.236428 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.337843 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.337926 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.337994 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.338927 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.343791 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.353442 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.446341 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.598546 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 09:15:01 crc kubenswrapper[4883]: I0310 09:15:01.563103 4883 generic.go:334] "Generic (PLEG): container finished" podID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" containerID="1cb9093a5dc1551f7fb85ef25abe36d1ab423453387c5dcc49644004e7492e56" exitCode=0
Mar 10 09:15:01 crc kubenswrapper[4883]: I0310 09:15:01.563215 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67" event={"ID":"e16ab2a6-c8ca-4487-b42f-381f61d18ba0","Type":"ContainerDied","Data":"1cb9093a5dc1551f7fb85ef25abe36d1ab423453387c5dcc49644004e7492e56"}
Mar 10 09:15:01 crc kubenswrapper[4883]: I0310 09:15:01.563526 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67" event={"ID":"e16ab2a6-c8ca-4487-b42f-381f61d18ba0","Type":"ContainerStarted","Data":"045f88345a3bdabd64b98dc407d9fd728b754e1365158cfc5d7267e108746abd"}
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.737395 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.868709 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") pod \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") "
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.868888 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") pod \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") "
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.868926 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") pod \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") "
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.869577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume" (OuterVolumeSpecName: "config-volume") pod "e16ab2a6-c8ca-4487-b42f-381f61d18ba0" (UID: "e16ab2a6-c8ca-4487-b42f-381f61d18ba0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.874689 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb" (OuterVolumeSpecName: "kube-api-access-bn6jb") pod "e16ab2a6-c8ca-4487-b42f-381f61d18ba0" (UID: "e16ab2a6-c8ca-4487-b42f-381f61d18ba0"). InnerVolumeSpecName "kube-api-access-bn6jb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.874731 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e16ab2a6-c8ca-4487-b42f-381f61d18ba0" (UID: "e16ab2a6-c8ca-4487-b42f-381f61d18ba0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.970495 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.970545 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") on node \"crc\" DevicePath \"\""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.970557 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 09:15:03 crc kubenswrapper[4883]: I0310 09:15:03.580382 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67" event={"ID":"e16ab2a6-c8ca-4487-b42f-381f61d18ba0","Type":"ContainerDied","Data":"045f88345a3bdabd64b98dc407d9fd728b754e1365158cfc5d7267e108746abd"}
Mar 10 09:15:03 crc kubenswrapper[4883]: I0310 09:15:03.580711 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="045f88345a3bdabd64b98dc407d9fd728b754e1365158cfc5d7267e108746abd"
Mar 10 09:15:03 crc kubenswrapper[4883]: I0310 09:15:03.580450 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.103635 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"]
Mar 10 09:15:53 crc kubenswrapper[4883]: E0310 09:15:53.104389 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" containerName="collect-profiles"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.104404 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" containerName="collect-profiles"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.104510 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" containerName="collect-profiles"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.104860 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.106416 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.106639 4883 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jkcv6"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.106861 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.107460 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-kl2rd"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.108114 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.111025 4883 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qw25n"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.117154 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dfhh4"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.117789 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.120070 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.120321 4883 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vstwv"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.130589 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kl2rd"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.142964 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dfhh4"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.292609 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrck\" (UniqueName: \"kubernetes.io/projected/1c0c9250-e9df-4898-bd0e-91919353a3f6-kube-api-access-8wrck\") pod \"cert-manager-858654f9db-kl2rd\" (UID: \"1c0c9250-e9df-4898-bd0e-91919353a3f6\") " pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.292758 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdtm7\" (UniqueName: \"kubernetes.io/projected/f33cf1b9-ce0d-41f4-8f36-1b159badc41e-kube-api-access-pdtm7\") pod \"cert-manager-webhook-687f57d79b-dfhh4\" (UID: \"f33cf1b9-ce0d-41f4-8f36-1b159badc41e\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.292913 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbpfv\" (UniqueName: \"kubernetes.io/projected/b92cb5d0-214a-49a6-b9b7-f210fef36956-kube-api-access-kbpfv\") pod \"cert-manager-cainjector-cf98fcc89-n2g9x\" (UID: \"b92cb5d0-214a-49a6-b9b7-f210fef36956\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.394226 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdtm7\" (UniqueName: \"kubernetes.io/projected/f33cf1b9-ce0d-41f4-8f36-1b159badc41e-kube-api-access-pdtm7\") pod \"cert-manager-webhook-687f57d79b-dfhh4\" (UID: \"f33cf1b9-ce0d-41f4-8f36-1b159badc41e\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.394301 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbpfv\" (UniqueName: \"kubernetes.io/projected/b92cb5d0-214a-49a6-b9b7-f210fef36956-kube-api-access-kbpfv\") pod \"cert-manager-cainjector-cf98fcc89-n2g9x\" (UID: \"b92cb5d0-214a-49a6-b9b7-f210fef36956\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.394370 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wrck\" (UniqueName: \"kubernetes.io/projected/1c0c9250-e9df-4898-bd0e-91919353a3f6-kube-api-access-8wrck\") pod \"cert-manager-858654f9db-kl2rd\" (UID: \"1c0c9250-e9df-4898-bd0e-91919353a3f6\") " pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.412521 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdtm7\" (UniqueName: \"kubernetes.io/projected/f33cf1b9-ce0d-41f4-8f36-1b159badc41e-kube-api-access-pdtm7\") pod \"cert-manager-webhook-687f57d79b-dfhh4\" (UID: \"f33cf1b9-ce0d-41f4-8f36-1b159badc41e\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.413172 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wrck\" (UniqueName: \"kubernetes.io/projected/1c0c9250-e9df-4898-bd0e-91919353a3f6-kube-api-access-8wrck\") pod \"cert-manager-858654f9db-kl2rd\" (UID: \"1c0c9250-e9df-4898-bd0e-91919353a3f6\") " pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.413602 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbpfv\" (UniqueName: \"kubernetes.io/projected/b92cb5d0-214a-49a6-b9b7-f210fef36956-kube-api-access-kbpfv\") pod \"cert-manager-cainjector-cf98fcc89-n2g9x\" (UID: \"b92cb5d0-214a-49a6-b9b7-f210fef36956\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.431053 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.435836 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.452725 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.803655 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kl2rd"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.828186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kl2rd" event={"ID":"1c0c9250-e9df-4898-bd0e-91919353a3f6","Type":"ContainerStarted","Data":"da4150833c09bc36a85adf0d73a58273a6d0cb80e6fb6f941d39fb274363c625"}
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.860272 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.863235 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dfhh4"]
Mar 10 09:15:53 crc kubenswrapper[4883]: W0310 09:15:53.870540 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92cb5d0_214a_49a6_b9b7_f210fef36956.slice/crio-0677f78f55af282e718229d75064158e8f521e4c0f5b72cc42c24b612ba336dc WatchSource:0}: Error finding container 0677f78f55af282e718229d75064158e8f521e4c0f5b72cc42c24b612ba336dc: Status 404 returned error can't find the container with id 0677f78f55af282e718229d75064158e8f521e4c0f5b72cc42c24b612ba336dc
Mar 10 09:15:53 crc kubenswrapper[4883]: W0310 09:15:53.877144 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf33cf1b9_ce0d_41f4_8f36_1b159badc41e.slice/crio-75f4248a44d760e9938375529d3567bc2fa003942302e375d40b588cc724aae6 WatchSource:0}: Error finding container 75f4248a44d760e9938375529d3567bc2fa003942302e375d40b588cc724aae6: Status 404 returned error can't find the container with id 75f4248a44d760e9938375529d3567bc2fa003942302e375d40b588cc724aae6
Mar 10 09:15:54 crc kubenswrapper[4883]: I0310 09:15:54.837097 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x" event={"ID":"b92cb5d0-214a-49a6-b9b7-f210fef36956","Type":"ContainerStarted","Data":"0677f78f55af282e718229d75064158e8f521e4c0f5b72cc42c24b612ba336dc"}
Mar 10 09:15:54 crc kubenswrapper[4883]: I0310 09:15:54.839152 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4" event={"ID":"f33cf1b9-ce0d-41f4-8f36-1b159badc41e","Type":"ContainerStarted","Data":"75f4248a44d760e9938375529d3567bc2fa003942302e375d40b588cc724aae6"}
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.857783 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4" event={"ID":"f33cf1b9-ce0d-41f4-8f36-1b159badc41e","Type":"ContainerStarted","Data":"769d2ee506b4abc0cac7ad309291e5fbe8da836ead0f52d1f5554cc66357937d"}
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.858653 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.861177 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x" event={"ID":"b92cb5d0-214a-49a6-b9b7-f210fef36956","Type":"ContainerStarted","Data":"8b993c6c5ec7a3d0bb2e90d797bbba40df33b5da3a486c51533ace44b82d27f2"}
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.863353 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kl2rd" event={"ID":"1c0c9250-e9df-4898-bd0e-91919353a3f6","Type":"ContainerStarted","Data":"75027be737cadc92b4bd3b4acd45f4ab16f2ec61d887b30a54d742f4c639e91a"}
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.878466 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4" podStartSLOduration=1.9734419779999999 podStartE2EDuration="4.878435299s" podCreationTimestamp="2026-03-10 09:15:53 +0000 UTC" firstStartedPulling="2026-03-10 09:15:53.879309652 +0000 UTC m=+740.134207541" lastFinishedPulling="2026-03-10 09:15:56.784302963 +0000 UTC m=+743.039200862" observedRunningTime="2026-03-10 09:15:57.872489788 +0000 UTC m=+744.127387677" watchObservedRunningTime="2026-03-10 09:15:57.878435299 +0000 UTC m=+744.133333188"
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.889597 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x" podStartSLOduration=2.00391127 podStartE2EDuration="4.889589111s" podCreationTimestamp="2026-03-10 09:15:53 +0000 UTC" firstStartedPulling="2026-03-10 09:15:53.879336362 +0000 UTC m=+740.134234251" lastFinishedPulling="2026-03-10 09:15:56.765014213 +0000 UTC m=+743.019912092" observedRunningTime="2026-03-10 09:15:57.888163192 +0000 UTC m=+744.143061080" watchObservedRunningTime="2026-03-10 09:15:57.889589111 +0000 UTC m=+744.144487000"
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.902657 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-kl2rd" podStartSLOduration=1.9541161790000001 podStartE2EDuration="4.902632065s" podCreationTimestamp="2026-03-10 09:15:53 +0000 UTC" firstStartedPulling="2026-03-10 09:15:53.810916541 +0000 UTC m=+740.065814430" lastFinishedPulling="2026-03-10 09:15:56.759432427 +0000 UTC m=+743.014330316" observedRunningTime="2026-03-10 09:15:57.90190361 +0000 UTC m=+744.156801500" watchObservedRunningTime="2026-03-10 09:15:57.902632065 +0000 UTC m=+744.157529954"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.154679 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"]
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.155890 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.158638 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.158639 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.168489 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.170766 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"]
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.278147 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") pod \"auto-csr-approver-29552236-hdd6d\" (UID: \"a2a502d2-d219-4f01-aebc-f27fb7766458\") " pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.379594 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") pod \"auto-csr-approver-29552236-hdd6d\" (UID: \"a2a502d2-d219-4f01-aebc-f27fb7766458\") " pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.401268 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") pod \"auto-csr-approver-29552236-hdd6d\" (UID: \"a2a502d2-d219-4f01-aebc-f27fb7766458\") " pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.471051 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.845133 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"]
Mar 10 09:16:00 crc kubenswrapper[4883]: W0310 09:16:00.850685 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2a502d2_d219_4f01_aebc_f27fb7766458.slice/crio-266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6 WatchSource:0}: Error finding container 266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6: Status 404 returned error can't find the container with id 266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.878023 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552236-hdd6d" event={"ID":"a2a502d2-d219-4f01-aebc-f27fb7766458","Type":"ContainerStarted","Data":"266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6"}
Mar 10 09:16:02 crc kubenswrapper[4883]: I0310 09:16:02.891907 4883 generic.go:334] "Generic (PLEG): container finished" podID="a2a502d2-d219-4f01-aebc-f27fb7766458" containerID="3c695c439ea9e2ad9e771b20e1905dadd59374ebe052ec433b69ea1e82161c99" exitCode=0
Mar 10 09:16:02 crc kubenswrapper[4883]: I0310 09:16:02.892055 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552236-hdd6d" event={"ID":"a2a502d2-d219-4f01-aebc-f27fb7766458","Type":"ContainerDied","Data":"3c695c439ea9e2ad9e771b20e1905dadd59374ebe052ec433b69ea1e82161c99"}
Mar 10 09:16:03 crc kubenswrapper[4883]: I0310 09:16:03.455779 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.099675 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.224946 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") pod \"a2a502d2-d219-4f01-aebc-f27fb7766458\" (UID: \"a2a502d2-d219-4f01-aebc-f27fb7766458\") "
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.230003 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb" (OuterVolumeSpecName: "kube-api-access-wxthb") pod "a2a502d2-d219-4f01-aebc-f27fb7766458" (UID: "a2a502d2-d219-4f01-aebc-f27fb7766458"). InnerVolumeSpecName "kube-api-access-wxthb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.325813 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") on node \"crc\" DevicePath \"\""
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.904041 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552236-hdd6d" event={"ID":"a2a502d2-d219-4f01-aebc-f27fb7766458","Type":"ContainerDied","Data":"266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6"}
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.904398 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6"
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.904107 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:05 crc kubenswrapper[4883]: I0310 09:16:05.142533 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"]
Mar 10 09:16:05 crc kubenswrapper[4883]: I0310 09:16:05.144275 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"]
Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.005720 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pzdml"]
Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006356 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-controller" containerID="cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" gracePeriod=30
Mar 10 09:16:06 crc
kubenswrapper[4883]: I0310 09:16:06.006514 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="northd" containerID="cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006416 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="nbdb" containerID="cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006630 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-node" containerID="cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006601 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-acl-logging" containerID="cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006667 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="sbdb" containerID="cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006512 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" 
podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.042423 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" containerID="cri-o://afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.085189 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e54057e-1436-403c-bd92-66dbb888b129" path="/var/lib/kubelet/pods/9e54057e-1436-403c-bd92-66dbb888b129/volumes" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.267865 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.270850 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovn-acl-logging/0.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.271511 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovn-controller/0.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.272034 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.321846 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xbrjj"] Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322154 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322173 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322182 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="nbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322190 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="nbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322198 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kubecfg-setup" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322204 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kubecfg-setup" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322212 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322218 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322230 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" 
containerName="northd" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322236 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="northd" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322242 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a502d2-d219-4f01-aebc-f27fb7766458" containerName="oc" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322247 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a502d2-d219-4f01-aebc-f27fb7766458" containerName="oc" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322258 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322263 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322271 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="sbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322276 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="sbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322285 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-acl-logging" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322290 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-acl-logging" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322299 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc 
kubenswrapper[4883]: I0310 09:16:06.322305 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322312 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322318 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322326 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-node" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322332 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-node" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322345 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322351 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322496 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322507 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322518 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" 
containerName="kube-rbac-proxy-node" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322528 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322538 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322546 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="northd" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322554 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322561 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322567 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a502d2-d219-4f01-aebc-f27fb7766458" containerName="oc" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322574 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-acl-logging" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322582 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="sbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322590 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="nbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322709 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322716 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322835 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.324547 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348348 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348401 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348461 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348502 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348524 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348622 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-kubelet\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348648 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovn-node-metrics-cert\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 
09:16:06.348665 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348687 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-log-socket\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348723 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-ovn\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348742 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-etc-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348761 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348787 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-script-lib\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348804 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-systemd-units\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348819 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-var-lib-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348836 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-env-overrides\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348860 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-node-log\") pod \"ovnkube-node-xbrjj\" (UID: 
\"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-systemd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-config\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348913 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-netd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348934 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8llmz\" (UniqueName: \"kubernetes.io/projected/a1a4989a-9fbe-41de-be68-4377681f9fd6-kube-api-access-8llmz\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348956 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-netns\") pod 
\"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348970 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-bin\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348989 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.349013 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-slash\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.349050 4883 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.349606 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: 
"fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.355753 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.363587 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449769 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449806 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449837 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") pod 
\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449889 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449897 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449912 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449936 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log" (OuterVolumeSpecName: "node-log") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449945 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449961 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449974 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450011 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450031 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450050 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450073 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450106 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450138 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450114 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450157 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450203 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450217 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket" (OuterVolumeSpecName: "log-socket") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450227 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450247 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450272 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450255 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash" (OuterVolumeSpecName: "host-slash") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450328 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450351 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450346 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450371 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450443 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-kubelet\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450508 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450535 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovn-node-metrics-cert\") pod 
\"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450551 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-kubelet\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450575 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-log-socket\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450619 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-log-socket\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450641 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-ovn\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450639 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 
09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450652 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450664 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-etc-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450688 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-etc-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450714 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450754 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450816 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-ovn\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450820 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450961 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450979 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-script-lib\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451366 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-var-lib-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451447 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-var-lib-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-systemd-units\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451557 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-env-overrides\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 
09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451603 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-systemd-units\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451701 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-node-log\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451756 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-systemd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451789 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-config\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451809 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-node-log\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451860 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-script-lib\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451884 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-netd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451914 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-systemd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451951 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-netd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452025 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8llmz\" (UniqueName: \"kubernetes.io/projected/a1a4989a-9fbe-41de-be68-4377681f9fd6-kube-api-access-8llmz\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452071 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-netns\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452110 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-bin\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452144 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452229 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-env-overrides\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452258 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-slash\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452232 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-slash\") pod 
\"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452292 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-netns\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452329 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-bin\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452449 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452514 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452535 4883 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452549 4883 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452564 4883 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452576 4883 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452587 4883 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452599 4883 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452613 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452625 4883 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452636 4883 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc 
kubenswrapper[4883]: I0310 09:16:06.452646 4883 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452646 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-config\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452657 4883 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452702 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452716 4883 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452735 4883 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452751 4883 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 
crc kubenswrapper[4883]: I0310 09:16:06.452760 4883 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452772 4883 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.454040 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovn-node-metrics-cert\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.454051 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5" (OuterVolumeSpecName: "kube-api-access-h98t5") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "kube-api-access-h98t5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.467459 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8llmz\" (UniqueName: \"kubernetes.io/projected/a1a4989a-9fbe-41de-be68-4377681f9fd6-kube-api-access-8llmz\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.554343 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.638033 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: W0310 09:16:06.659597 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a4989a_9fbe_41de_be68_4377681f9fd6.slice/crio-475407723f6183c9255a0a3acf38f084fe69309ffac7a14a4e73d2dfd3c5ac0c WatchSource:0}: Error finding container 475407723f6183c9255a0a3acf38f084fe69309ffac7a14a4e73d2dfd3c5ac0c: Status 404 returned error can't find the container with id 475407723f6183c9255a0a3acf38f084fe69309ffac7a14a4e73d2dfd3c5ac0c Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.918268 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.921196 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovn-acl-logging/0.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.921763 4883 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovn-controller/0.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922216 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922246 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922254 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922263 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922271 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922277 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922287 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" exitCode=143 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 
09:16:06.922295 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" exitCode=143 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922308 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922337 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922375 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922387 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922403 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 
09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922418 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922431 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922431 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922555 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922564 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922571 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922578 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922584 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 
crc kubenswrapper[4883]: I0310 09:16:06.922591 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922598 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922604 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922614 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922626 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922633 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922639 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922644 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922650 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922656 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922663 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922670 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922677 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922683 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922692 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922703 4883 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922709 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922716 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922723 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922731 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922737 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922743 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922748 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922754 4883 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922758 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922766 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922775 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922781 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922786 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922791 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922796 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 
09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922802 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922807 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922814 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922819 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922825 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.925797 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/2.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.926297 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/1.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.926330 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e883c29-520e-4b1f-b49c-3df10450d467" containerID="5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d" exitCode=2 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.926399 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerDied","Data":"5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.926441 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.927429 4883 scope.go:117] "RemoveContainer" containerID="5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.928825 4883 generic.go:334] "Generic (PLEG): container finished" podID="a1a4989a-9fbe-41de-be68-4377681f9fd6" containerID="e8f9377905675bb42c0dd73b121c12a4cf939452f080a77a60656a017a2c06e0" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.928882 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerDied","Data":"e8f9377905675bb42c0dd73b121c12a4cf939452f080a77a60656a017a2c06e0"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.928937 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"475407723f6183c9255a0a3acf38f084fe69309ffac7a14a4e73d2dfd3c5ac0c"} Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.929427 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467)\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:16:06 crc 
kubenswrapper[4883]: I0310 09:16:06.951575 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.958021 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pzdml"] Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.960697 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pzdml"] Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.968238 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.980807 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.992440 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.015721 4883 scope.go:117] "RemoveContainer" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.030471 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.044101 4883 scope.go:117] "RemoveContainer" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.058967 4883 scope.go:117] "RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.078546 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.092555 4883 
scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.092839 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.092880 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.092919 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.093249 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not exist" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.093279 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} err="failed to get container status \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.093303 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.093647 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.093673 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} err="failed to get container status \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.093688 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.094008 4883 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094030 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} err="failed to get container status \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094043 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.094377 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094402 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} err="failed to get container status \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": rpc error: code = NotFound desc = could not find container 
\"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094420 4883 scope.go:117] "RemoveContainer" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.094734 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094755 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} err="failed to get container status \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094809 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.095061 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" 
containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095082 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} err="failed to get container status \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095104 4883 scope.go:117] "RemoveContainer" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.095374 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095400 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} err="failed to get container status \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": rpc error: code = NotFound desc = could not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095415 4883 scope.go:117] 
"RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.095703 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095745 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} err="failed to get container status \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095759 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.096090 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not exist" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096117 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} err="failed to get container status \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": rpc error: code = NotFound desc = could not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096134 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096396 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096419 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096674 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} err="failed to get container status \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not 
exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096692 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096947 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} err="failed to get container status \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096970 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097225 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} err="failed to get container status \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097242 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097461 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} err="failed to get container status 
\"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": rpc error: code = NotFound desc = could not find container \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097493 4883 scope.go:117] "RemoveContainer" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097740 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} err="failed to get container status \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097758 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097971 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} err="failed to get container status \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097987 4883 scope.go:117] "RemoveContainer" 
containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098235 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} err="failed to get container status \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": rpc error: code = NotFound desc = could not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098254 4883 scope.go:117] "RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098498 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} err="failed to get container status \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098519 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098746 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} err="failed to get container status \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": rpc error: code = NotFound desc = could 
not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098763 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098997 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.099022 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.099301 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} err="failed to get container status \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.099333 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 
09:16:07.099589 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} err="failed to get container status \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.099608 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100051 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} err="failed to get container status \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100072 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100352 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} err="failed to get container status \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": rpc error: code = NotFound desc = could not find container \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 
08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100371 4883 scope.go:117] "RemoveContainer" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100663 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} err="failed to get container status \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100685 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100940 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} err="failed to get container status \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100958 4883 scope.go:117] "RemoveContainer" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101229 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} err="failed to get container status \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": rpc error: code = NotFound desc = could not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101247 4883 scope.go:117] "RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101439 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} err="failed to get container status \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101456 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101681 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} err="failed to get container status \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": rpc error: code = NotFound desc = could not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not 
exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101703 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101975 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101997 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102220 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} err="failed to get container status \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102239 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102516 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} err="failed to get container status 
\"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102540 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102739 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} err="failed to get container status \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102760 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102996 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} err="failed to get container status \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": rpc error: code = NotFound desc = could not find container \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103013 4883 scope.go:117] "RemoveContainer" 
containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103246 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} err="failed to get container status \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103266 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103633 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} err="failed to get container status \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103659 4883 scope.go:117] "RemoveContainer" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103962 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} err="failed to get container status \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": rpc error: code = NotFound desc = could 
not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103981 4883 scope.go:117] "RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.104270 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} err="failed to get container status \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.104291 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.104514 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} err="failed to get container status \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": rpc error: code = NotFound desc = could not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.104535 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 
09:16:07.104755 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.939621 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"ef18237781cf3eb2b10d931e7aee279eb2a8cc0a999469990928f6bbe7f4b0c2"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.939993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"fdc4eb9ef955c2643f68207ef59ea1567ab6b0f96683426e45ffff4c4d7fcdf7"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.940007 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"674e6c1129630fdafe78cd8f2534e7135163fdd79f70b5f0c022fd4ac6f7071a"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.940017 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"74a755b6f42093cc0389ec104591dbfc9802ef4f52b3e24519cde248bce23bb0"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.940027 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" 
event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"e3f622dc1c73a5d7d1ba16b7dc9a8be7459f672e58cd877c2d07bb363180f22e"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.940035 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"d597f39ccfeea416934eb2235d29690bb961304ca221b81975823d2f3d155b6a"} Mar 10 09:16:08 crc kubenswrapper[4883]: I0310 09:16:08.089004 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" path="/var/lib/kubelet/pods/fc928c48-1df8-4c31-986e-eba2aa7a1c0b/volumes" Mar 10 09:16:09 crc kubenswrapper[4883]: I0310 09:16:09.956048 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"46c4372107ba29a84a898ab5768567eb2230ed222c911c3e103d84a58fbb5627"} Mar 10 09:16:11 crc kubenswrapper[4883]: I0310 09:16:11.975189 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"b1d402380d7cad480196967114f6270018a83ac6af0e3ec4056ea5c3461b3c75"} Mar 10 09:16:11 crc kubenswrapper[4883]: I0310 09:16:11.975752 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:11 crc kubenswrapper[4883]: I0310 09:16:11.975785 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:11 crc kubenswrapper[4883]: I0310 09:16:11.975808 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:12 crc kubenswrapper[4883]: I0310 09:16:12.005402 4883 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:12 crc kubenswrapper[4883]: I0310 09:16:12.010193 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:12 crc kubenswrapper[4883]: I0310 09:16:12.016894 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" podStartSLOduration=6.016879561 podStartE2EDuration="6.016879561s" podCreationTimestamp="2026-03-10 09:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:16:12.011968531 +0000 UTC m=+758.266866420" watchObservedRunningTime="2026-03-10 09:16:12.016879561 +0000 UTC m=+758.271777450" Mar 10 09:16:17 crc kubenswrapper[4883]: I0310 09:16:17.449268 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:16:17 crc kubenswrapper[4883]: I0310 09:16:17.450075 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:16:20 crc kubenswrapper[4883]: I0310 09:16:20.079674 4883 scope.go:117] "RemoveContainer" containerID="5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d" Mar 10 09:16:20 crc kubenswrapper[4883]: E0310 09:16:20.080276 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s 
restarting failed container=kube-multus pod=multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467)\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.264668 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd"] Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.266895 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.269040 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.276976 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd"] Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.366893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.366981 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.367012 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.468435 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.468540 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.468567 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: 
I0310 09:16:33.469059 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.469106 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.486466 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.586111 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: E0310 09:16:33.619045 4883 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(19b0cf1042716c8df6c003917583506f8b407b2f7eca648890497c4cd0bd90fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:16:33 crc kubenswrapper[4883]: E0310 09:16:33.619146 4883 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(19b0cf1042716c8df6c003917583506f8b407b2f7eca648890497c4cd0bd90fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: E0310 09:16:33.619179 4883 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(19b0cf1042716c8df6c003917583506f8b407b2f7eca648890497c4cd0bd90fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: E0310 09:16:33.619253 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace(eb61f8a4-ceed-4f2a-91fe-ead52fb416ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace(eb61f8a4-ceed-4f2a-91fe-ead52fb416ee)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(19b0cf1042716c8df6c003917583506f8b407b2f7eca648890497c4cd0bd90fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.082697 4883 scope.go:117] "RemoveContainer" containerID="5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.092963 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.093301 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:34 crc kubenswrapper[4883]: E0310 09:16:34.122196 4883 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(d7c0f66777e5209e670ab0846956bba98b9fc5ab89cac08b1f0c3a0fa2ac1126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:16:34 crc kubenswrapper[4883]: E0310 09:16:34.122265 4883 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(d7c0f66777e5209e670ab0846956bba98b9fc5ab89cac08b1f0c3a0fa2ac1126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:34 crc kubenswrapper[4883]: E0310 09:16:34.122296 4883 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(d7c0f66777e5209e670ab0846956bba98b9fc5ab89cac08b1f0c3a0fa2ac1126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:34 crc kubenswrapper[4883]: E0310 09:16:34.122352 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace(eb61f8a4-ceed-4f2a-91fe-ead52fb416ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace(eb61f8a4-ceed-4f2a-91fe-ead52fb416ee)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(d7c0f66777e5209e670ab0846956bba98b9fc5ab89cac08b1f0c3a0fa2ac1126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.380335 4883 scope.go:117] "RemoveContainer" containerID="7590615cba141d2df532a5f2b91dc13b678e9424c198e88d350b709d2d0d8639" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.408003 4883 scope.go:117] "RemoveContainer" containerID="498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0" Mar 10 09:16:35 crc kubenswrapper[4883]: I0310 09:16:35.103160 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/2.log" Mar 10 09:16:35 crc kubenswrapper[4883]: I0310 09:16:35.103244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"49260ae22d078e2736178f52f24f2210a2279bcdd98e9c39923978aa4fc77ff2"} Mar 10 09:16:36 crc kubenswrapper[4883]: I0310 
09:16:36.660962 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:47 crc kubenswrapper[4883]: I0310 09:16:47.448834 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:16:47 crc kubenswrapper[4883]: I0310 09:16:47.449620 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:16:48 crc kubenswrapper[4883]: I0310 09:16:48.079285 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:48 crc kubenswrapper[4883]: I0310 09:16:48.079858 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:48 crc kubenswrapper[4883]: I0310 09:16:48.246036 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd"] Mar 10 09:16:49 crc kubenswrapper[4883]: I0310 09:16:49.184249 4883 generic.go:334] "Generic (PLEG): container finished" podID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerID="36e8ac20e381043c167658371a35a780725504424d32e501e74c3d910453459d" exitCode=0 Mar 10 09:16:49 crc kubenswrapper[4883]: I0310 09:16:49.184345 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerDied","Data":"36e8ac20e381043c167658371a35a780725504424d32e501e74c3d910453459d"} Mar 10 09:16:49 crc kubenswrapper[4883]: I0310 09:16:49.184707 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerStarted","Data":"5255d6c98df782f934196c3e1e921929f7c3086234bcf25dd75dd7cf62a47038"} Mar 10 09:16:51 crc kubenswrapper[4883]: I0310 09:16:51.198869 4883 generic.go:334] "Generic (PLEG): container finished" podID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerID="9e7e07ed48b3e382b8d19e5c933331df54943b9bd98226dea20b61c50c91eb15" exitCode=0 Mar 10 09:16:51 crc kubenswrapper[4883]: I0310 09:16:51.198924 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerDied","Data":"9e7e07ed48b3e382b8d19e5c933331df54943b9bd98226dea20b61c50c91eb15"} Mar 10 09:16:52 crc kubenswrapper[4883]: I0310 09:16:52.206913 4883 
generic.go:334] "Generic (PLEG): container finished" podID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerID="0f2feaf6b098d019f4e99f5790974e9d1121e3e582a8163ac9049bf43eeb604b" exitCode=0 Mar 10 09:16:52 crc kubenswrapper[4883]: I0310 09:16:52.206997 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerDied","Data":"0f2feaf6b098d019f4e99f5790974e9d1121e3e582a8163ac9049bf43eeb604b"} Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.412871 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.585786 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") pod \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.585836 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") pod \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.585958 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") pod \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.588192 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle" (OuterVolumeSpecName: "bundle") pod "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" (UID: "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.601316 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz" (OuterVolumeSpecName: "kube-api-access-ccdjz") pod "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" (UID: "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee"). InnerVolumeSpecName "kube-api-access-ccdjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.606816 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util" (OuterVolumeSpecName: "util") pod "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" (UID: "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.687917 4883 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.688242 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.688339 4883 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:54 crc kubenswrapper[4883]: I0310 09:16:54.220872 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerDied","Data":"5255d6c98df782f934196c3e1e921929f7c3086234bcf25dd75dd7cf62a47038"} Mar 10 09:16:54 crc kubenswrapper[4883]: I0310 09:16:54.220925 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5255d6c98df782f934196c3e1e921929f7c3086234bcf25dd75dd7cf62a47038" Mar 10 09:16:54 crc kubenswrapper[4883]: I0310 09:16:54.220947 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.734550 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s"] Mar 10 09:16:59 crc kubenswrapper[4883]: E0310 09:16:59.735994 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="pull" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.736099 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="pull" Mar 10 09:16:59 crc kubenswrapper[4883]: E0310 09:16:59.736168 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="util" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.736232 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="util" Mar 10 09:16:59 crc kubenswrapper[4883]: E0310 09:16:59.736289 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="extract" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.736347 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="extract" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.736534 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="extract" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.737048 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.738797 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.739592 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.739726 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7xfw4" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.750124 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s"] Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.755865 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq2f2\" (UniqueName: \"kubernetes.io/projected/a776287a-5b99-4f43-8d4c-191108392859-kube-api-access-lq2f2\") pod \"nmstate-operator-75c5dccd6c-k4v4s\" (UID: \"a776287a-5b99-4f43-8d4c-191108392859\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.857048 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq2f2\" (UniqueName: \"kubernetes.io/projected/a776287a-5b99-4f43-8d4c-191108392859-kube-api-access-lq2f2\") pod \"nmstate-operator-75c5dccd6c-k4v4s\" (UID: \"a776287a-5b99-4f43-8d4c-191108392859\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.875174 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq2f2\" (UniqueName: \"kubernetes.io/projected/a776287a-5b99-4f43-8d4c-191108392859-kube-api-access-lq2f2\") pod \"nmstate-operator-75c5dccd6c-k4v4s\" (UID: 
\"a776287a-5b99-4f43-8d4c-191108392859\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:17:00 crc kubenswrapper[4883]: I0310 09:17:00.051032 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:17:00 crc kubenswrapper[4883]: I0310 09:17:00.200417 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s"] Mar 10 09:17:00 crc kubenswrapper[4883]: I0310 09:17:00.261110 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" event={"ID":"a776287a-5b99-4f43-8d4c-191108392859","Type":"ContainerStarted","Data":"a6c962e1c0b3a4aa6c68d195bc11275fd828c7232e1e4003e6b3c87f5bce4a71"} Mar 10 09:17:03 crc kubenswrapper[4883]: I0310 09:17:03.284286 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" event={"ID":"a776287a-5b99-4f43-8d4c-191108392859","Type":"ContainerStarted","Data":"9c5ca8e5aa4cec992567a344c1c28ba7a506efdcd66863b5dfb24607c6bb561d"} Mar 10 09:17:03 crc kubenswrapper[4883]: I0310 09:17:03.302040 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" podStartSLOduration=1.8214824219999999 podStartE2EDuration="4.30201833s" podCreationTimestamp="2026-03-10 09:16:59 +0000 UTC" firstStartedPulling="2026-03-10 09:17:00.208088542 +0000 UTC m=+806.462986430" lastFinishedPulling="2026-03-10 09:17:02.688624449 +0000 UTC m=+808.943522338" observedRunningTime="2026-03-10 09:17:03.301147718 +0000 UTC m=+809.556045607" watchObservedRunningTime="2026-03-10 09:17:03.30201833 +0000 UTC m=+809.556916219" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.152438 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-x5lcq"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 
09:17:04.153306 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.157787 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ccbds"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.158597 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.159940 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bnhg7" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.160102 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.166650 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-x5lcq"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.171665 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ccbds"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.195165 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5lcxd"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.195815 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.257599 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.258560 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.263752 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.263985 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.265045 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5d6x9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.268421 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316248 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpns9\" (UniqueName: \"kubernetes.io/projected/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-kube-api-access-fpns9\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316311 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-ovs-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316373 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpckr\" (UniqueName: \"kubernetes.io/projected/291985dd-d623-46ba-9e1b-056dc17d26ed-kube-api-access-zpckr\") pod \"nmstate-metrics-69594cc75-x5lcq\" (UID: \"291985dd-d623-46ba-9e1b-056dc17d26ed\") " 
pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316398 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316439 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-nmstate-lock\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316492 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-dbus-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316514 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf67s\" (UniqueName: \"kubernetes.io/projected/10ab1e00-47a1-4f9a-a55a-131935759d8d-kube-api-access-zf67s\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418066 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpckr\" (UniqueName: \"kubernetes.io/projected/291985dd-d623-46ba-9e1b-056dc17d26ed-kube-api-access-zpckr\") pod 
\"nmstate-metrics-69594cc75-x5lcq\" (UID: \"291985dd-d623-46ba-9e1b-056dc17d26ed\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418110 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418154 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sgnd\" (UniqueName: \"kubernetes.io/projected/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-kube-api-access-9sgnd\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418182 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418207 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-nmstate-lock\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418250 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-dbus-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418266 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf67s\" (UniqueName: \"kubernetes.io/projected/10ab1e00-47a1-4f9a-a55a-131935759d8d-kube-api-access-zf67s\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: E0310 09:17:04.418290 4883 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418343 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-nmstate-lock\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: E0310 09:17:04.418375 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair podName:10ab1e00-47a1-4f9a-a55a-131935759d8d nodeName:}" failed. No retries permitted until 2026-03-10 09:17:04.918354483 +0000 UTC m=+811.173252373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair") pod "nmstate-webhook-786f45cff4-ccbds" (UID: "10ab1e00-47a1-4f9a-a55a-131935759d8d") : secret "openshift-nmstate-webhook" not found Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418298 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpns9\" (UniqueName: \"kubernetes.io/projected/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-kube-api-access-fpns9\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418583 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-ovs-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418597 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-dbus-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418629 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418669 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" 
(UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-ovs-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.437806 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf67s\" (UniqueName: \"kubernetes.io/projected/10ab1e00-47a1-4f9a-a55a-131935759d8d-kube-api-access-zf67s\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.442161 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpckr\" (UniqueName: \"kubernetes.io/projected/291985dd-d623-46ba-9e1b-056dc17d26ed-kube-api-access-zpckr\") pod \"nmstate-metrics-69594cc75-x5lcq\" (UID: \"291985dd-d623-46ba-9e1b-056dc17d26ed\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.442618 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpns9\" (UniqueName: \"kubernetes.io/projected/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-kube-api-access-fpns9\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.450011 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f5c87b79f-276j9"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.450856 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.465660 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f5c87b79f-276j9"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.467801 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.506103 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519537 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4nbq\" (UniqueName: \"kubernetes.io/projected/0a27925b-cdd8-4de0-9550-c885b528b9e4-kube-api-access-q4nbq\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519578 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-oauth-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519600 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519654 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-trusted-ca-bundle\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519688 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519721 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-oauth-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519739 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519877 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sgnd\" (UniqueName: \"kubernetes.io/projected/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-kube-api-access-9sgnd\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 
09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519910 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519927 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-service-ca\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.521715 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.525514 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.534418 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sgnd\" (UniqueName: \"kubernetes.io/projected/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-kube-api-access-9sgnd\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") 
" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.575762 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620440 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-oauth-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620503 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620547 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-trusted-ca-bundle\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620586 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-oauth-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620606 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620651 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-service-ca\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620672 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4nbq\" (UniqueName: \"kubernetes.io/projected/0a27925b-cdd8-4de0-9550-c885b528b9e4-kube-api-access-q4nbq\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.621379 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-oauth-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.622005 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-service-ca\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.622412 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-trusted-ca-bundle\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.622438 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.624910 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.625564 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-oauth-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.637725 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4nbq\" (UniqueName: \"kubernetes.io/projected/0a27925b-cdd8-4de0-9550-c885b528b9e4-kube-api-access-q4nbq\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.726551 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf"] Mar 10 09:17:04 crc kubenswrapper[4883]: 
W0310 09:17:04.731551 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod805fc4e3_bab7_415e_a190_0ceeda5bd8b7.slice/crio-3c63283247090812585319257a6867f37a94f47618e410c0c1307e7ab2b11f23 WatchSource:0}: Error finding container 3c63283247090812585319257a6867f37a94f47618e410c0c1307e7ab2b11f23: Status 404 returned error can't find the container with id 3c63283247090812585319257a6867f37a94f47618e410c0c1307e7ab2b11f23 Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.810025 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: W0310 09:17:04.869176 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291985dd_d623_46ba_9e1b_056dc17d26ed.slice/crio-a8d3f2702fd40b7e1efb17bf3b72742b5d45250ba1008c7df6e0a0dcf84f60e6 WatchSource:0}: Error finding container a8d3f2702fd40b7e1efb17bf3b72742b5d45250ba1008c7df6e0a0dcf84f60e6: Status 404 returned error can't find the container with id a8d3f2702fd40b7e1efb17bf3b72742b5d45250ba1008c7df6e0a0dcf84f60e6 Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.871072 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-x5lcq"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.923963 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.927469 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.085459 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.175923 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f5c87b79f-276j9"] Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.248920 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ccbds"] Mar 10 09:17:05 crc kubenswrapper[4883]: W0310 09:17:05.265675 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10ab1e00_47a1_4f9a_a55a_131935759d8d.slice/crio-ca526d8f45e90da3948c284971dc860238dae2b5339d0df09c43df7e1bb14502 WatchSource:0}: Error finding container ca526d8f45e90da3948c284971dc860238dae2b5339d0df09c43df7e1bb14502: Status 404 returned error can't find the container with id ca526d8f45e90da3948c284971dc860238dae2b5339d0df09c43df7e1bb14502 Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.296870 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" event={"ID":"291985dd-d623-46ba-9e1b-056dc17d26ed","Type":"ContainerStarted","Data":"a8d3f2702fd40b7e1efb17bf3b72742b5d45250ba1008c7df6e0a0dcf84f60e6"} Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.298225 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5lcxd" event={"ID":"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32","Type":"ContainerStarted","Data":"610e06fd61703a658e69b2efd6e988a3b049126b6a321ba1f4969be1737eed09"} Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 
09:17:05.299457 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" event={"ID":"10ab1e00-47a1-4f9a-a55a-131935759d8d","Type":"ContainerStarted","Data":"ca526d8f45e90da3948c284971dc860238dae2b5339d0df09c43df7e1bb14502"} Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.302124 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f5c87b79f-276j9" event={"ID":"0a27925b-cdd8-4de0-9550-c885b528b9e4","Type":"ContainerStarted","Data":"2794f47035450accbeeb6b159c10062d75e8c0dc020a9dc7c6db25644ec8a7f4"} Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.303060 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" event={"ID":"805fc4e3-bab7-415e-a190-0ceeda5bd8b7","Type":"ContainerStarted","Data":"3c63283247090812585319257a6867f37a94f47618e410c0c1307e7ab2b11f23"} Mar 10 09:17:06 crc kubenswrapper[4883]: I0310 09:17:06.312667 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f5c87b79f-276j9" event={"ID":"0a27925b-cdd8-4de0-9550-c885b528b9e4","Type":"ContainerStarted","Data":"6b37d9d45fcd2e3f298863e909656ecfabcdab5de0e0ca9edabfae145dd305dd"} Mar 10 09:17:06 crc kubenswrapper[4883]: I0310 09:17:06.332993 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f5c87b79f-276j9" podStartSLOduration=2.332970788 podStartE2EDuration="2.332970788s" podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:17:06.328788682 +0000 UTC m=+812.583686571" watchObservedRunningTime="2026-03-10 09:17:06.332970788 +0000 UTC m=+812.587868668" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.329037 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" 
event={"ID":"291985dd-d623-46ba-9e1b-056dc17d26ed","Type":"ContainerStarted","Data":"60bba0ac4713e076ae773662f22d057c62e6f792cea8147a81fcc4ad2de183e8"} Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.330501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5lcxd" event={"ID":"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32","Type":"ContainerStarted","Data":"8e743bf98a4b7c19707c708bbe3ae55e6cd99546ee27ae36c8a9a11adef9c198"} Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.330640 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.332034 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" event={"ID":"10ab1e00-47a1-4f9a-a55a-131935759d8d","Type":"ContainerStarted","Data":"435e91624b781dddcbca037217e0bf0479131e3e631316b8bc63873d53ab0138"} Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.332099 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.334092 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" event={"ID":"805fc4e3-bab7-415e-a190-0ceeda5bd8b7","Type":"ContainerStarted","Data":"5ba26dfc0001a6d7e6030e41a1476883a8b7f7d62267f47a3fff99518b95f655"} Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.346233 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5lcxd" podStartSLOduration=1.468263403 podStartE2EDuration="4.346216507s" podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="2026-03-10 09:17:04.527352689 +0000 UTC m=+810.782250579" lastFinishedPulling="2026-03-10 09:17:07.405305795 +0000 UTC m=+813.660203683" observedRunningTime="2026-03-10 
09:17:08.346102221 +0000 UTC m=+814.601000111" watchObservedRunningTime="2026-03-10 09:17:08.346216507 +0000 UTC m=+814.601114396" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.362202 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" podStartSLOduration=1.69055761 podStartE2EDuration="4.362193815s" podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="2026-03-10 09:17:04.733987558 +0000 UTC m=+810.988885448" lastFinishedPulling="2026-03-10 09:17:07.405623764 +0000 UTC m=+813.660521653" observedRunningTime="2026-03-10 09:17:08.361331738 +0000 UTC m=+814.616229627" watchObservedRunningTime="2026-03-10 09:17:08.362193815 +0000 UTC m=+814.617091703" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.383876 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" podStartSLOduration=2.2504544969999998 podStartE2EDuration="4.383846372s" podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="2026-03-10 09:17:05.273796308 +0000 UTC m=+811.528694197" lastFinishedPulling="2026-03-10 09:17:07.407188184 +0000 UTC m=+813.662086072" observedRunningTime="2026-03-10 09:17:08.377926018 +0000 UTC m=+814.632823907" watchObservedRunningTime="2026-03-10 09:17:08.383846372 +0000 UTC m=+814.638744261" Mar 10 09:17:10 crc kubenswrapper[4883]: I0310 09:17:10.359904 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" event={"ID":"291985dd-d623-46ba-9e1b-056dc17d26ed","Type":"ContainerStarted","Data":"7d17991f0f95e570e28efdbbe1e355d52cd0ec8be617f0a9c2372dfae74c99ec"} Mar 10 09:17:10 crc kubenswrapper[4883]: I0310 09:17:10.376640 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" podStartSLOduration=1.267102166 podStartE2EDuration="6.376616702s" 
podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="2026-03-10 09:17:04.871243969 +0000 UTC m=+811.126141868" lastFinishedPulling="2026-03-10 09:17:09.980758515 +0000 UTC m=+816.235656404" observedRunningTime="2026-03-10 09:17:10.375269823 +0000 UTC m=+816.630167701" watchObservedRunningTime="2026-03-10 09:17:10.376616702 +0000 UTC m=+816.631514592" Mar 10 09:17:14 crc kubenswrapper[4883]: I0310 09:17:14.531443 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:14 crc kubenswrapper[4883]: I0310 09:17:14.810698 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:14 crc kubenswrapper[4883]: I0310 09:17:14.810767 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:14 crc kubenswrapper[4883]: I0310 09:17:14.815655 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:15 crc kubenswrapper[4883]: I0310 09:17:15.393563 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:15 crc kubenswrapper[4883]: I0310 09:17:15.434252 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"] Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.448891 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.449374 4883 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.449434 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.450195 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.450261 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c" gracePeriod=600 Mar 10 09:17:18 crc kubenswrapper[4883]: I0310 09:17:18.410992 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c" exitCode=0 Mar 10 09:17:18 crc kubenswrapper[4883]: I0310 09:17:18.411090 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c"} Mar 10 09:17:18 crc kubenswrapper[4883]: I0310 09:17:18.411704 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e"} Mar 10 09:17:18 crc kubenswrapper[4883]: I0310 09:17:18.411758 4883 scope.go:117] "RemoveContainer" containerID="fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13" Mar 10 09:17:25 crc kubenswrapper[4883]: I0310 09:17:25.091903 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:26 crc kubenswrapper[4883]: I0310 09:17:26.112533 4883 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 09:17:36 crc kubenswrapper[4883]: I0310 09:17:36.876435 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"] Mar 10 09:17:36 crc kubenswrapper[4883]: I0310 09:17:36.878901 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:36 crc kubenswrapper[4883]: I0310 09:17:36.882618 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 09:17:36 crc kubenswrapper[4883]: I0310 09:17:36.887555 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"] Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.028384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.028562 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.028645 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: 
I0310 09:17:37.130168 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.130241 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.130280 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.130783 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.130820 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.149291 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.196628 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.351218 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"] Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.535618 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerStarted","Data":"0d126ab653fc0a17d08ec204f3cad246a3ba074a74ae06be7ba81b3541fc4041"} Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.535673 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerStarted","Data":"09e5e9966c9e77e8871e74d42d8fc0d89cf7a2a605c521a49f9476f6877182ee"} Mar 10 09:17:38 crc kubenswrapper[4883]: I0310 09:17:38.542306 4883 
generic.go:334] "Generic (PLEG): container finished" podID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerID="0d126ab653fc0a17d08ec204f3cad246a3ba074a74ae06be7ba81b3541fc4041" exitCode=0 Mar 10 09:17:38 crc kubenswrapper[4883]: I0310 09:17:38.542365 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerDied","Data":"0d126ab653fc0a17d08ec204f3cad246a3ba074a74ae06be7ba81b3541fc4041"} Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.433742 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"] Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.435841 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.446850 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"] Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.463207 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nbvf4" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console" containerID="cri-o://2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e" gracePeriod=15 Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.554775 4883 generic.go:334] "Generic (PLEG): container finished" podID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerID="930e380027834eadef223b69b206531acf59794b14dc67c94e162258576d9598" exitCode=0 Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.554824 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" 
event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerDied","Data":"930e380027834eadef223b69b206531acf59794b14dc67c94e162258576d9598"} Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.570301 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.570385 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.570411 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.671983 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.672034 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5bmn\" (UniqueName: 
\"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.672082 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.672532 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.672599 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.688495 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.748646 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.757680 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nbvf4_dd173309-9e96-468f-a21c-f25c86186744/console/0.log" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.757751 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876148 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876258 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876311 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876348 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876387 
4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876406 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876443 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.877276 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config" (OuterVolumeSpecName: "console-config") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.877368 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.877632 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.878724 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca" (OuterVolumeSpecName: "service-ca") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.881209 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.882341 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p" (OuterVolumeSpecName: "kube-api-access-8fw5p") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "kube-api-access-8fw5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.883216 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.916770 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"] Mar 10 09:17:40 crc kubenswrapper[4883]: W0310 09:17:40.923029 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c5e84e5_8671_4388_a92e_6ce1ecab3f48.slice/crio-5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45 WatchSource:0}: Error finding container 5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45: Status 404 returned error can't find the container with id 5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45 Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.977956 4883 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.977986 4883 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.977999 4883 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.978011 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.978020 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.978029 4883 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.978037 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562445 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nbvf4_dd173309-9e96-468f-a21c-f25c86186744/console/0.log" Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562793 4883 generic.go:334] "Generic (PLEG): container finished" podID="dd173309-9e96-468f-a21c-f25c86186744" containerID="2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e" exitCode=2 Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562871 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562896 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbvf4" event={"ID":"dd173309-9e96-468f-a21c-f25c86186744","Type":"ContainerDied","Data":"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e"} Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562940 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbvf4" event={"ID":"dd173309-9e96-468f-a21c-f25c86186744","Type":"ContainerDied","Data":"4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa"} Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562976 4883 scope.go:117] "RemoveContainer" containerID="2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e" Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.564942 4883 generic.go:334] "Generic (PLEG): container finished" podID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerID="9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99" exitCode=0 Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.565021 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerDied","Data":"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99"} Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.565096 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerStarted","Data":"5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45"} Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.570674 4883 generic.go:334] "Generic (PLEG): container finished" podID="8aebf63f-b8d3-496c-a660-c484d574fb63" 
containerID="b3534952ad463b08ff6cd601ba9ff1b1621cb24e5aa9c6acbd423bdc05afdde7" exitCode=0 Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.570719 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerDied","Data":"b3534952ad463b08ff6cd601ba9ff1b1621cb24e5aa9c6acbd423bdc05afdde7"} Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.578250 4883 scope.go:117] "RemoveContainer" containerID="2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e" Mar 10 09:17:41 crc kubenswrapper[4883]: E0310 09:17:41.579084 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e\": container with ID starting with 2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e not found: ID does not exist" containerID="2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e" Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.579144 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e"} err="failed to get container status \"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e\": rpc error: code = NotFound desc = could not find container \"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e\": container with ID starting with 2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e not found: ID does not exist" Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.631612 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"] Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.649098 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-f9d7485db-nbvf4"] Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.086731 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd173309-9e96-468f-a21c-f25c86186744" path="/var/lib/kubelet/pods/dd173309-9e96-468f-a21c-f25c86186744/volumes" Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.582042 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerStarted","Data":"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3"} Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.778954 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.905278 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") pod \"8aebf63f-b8d3-496c-a660-c484d574fb63\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.905384 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") pod \"8aebf63f-b8d3-496c-a660-c484d574fb63\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.905523 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") pod \"8aebf63f-b8d3-496c-a660-c484d574fb63\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.906211 4883 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle" (OuterVolumeSpecName: "bundle") pod "8aebf63f-b8d3-496c-a660-c484d574fb63" (UID: "8aebf63f-b8d3-496c-a660-c484d574fb63"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.911216 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j" (OuterVolumeSpecName: "kube-api-access-mz22j") pod "8aebf63f-b8d3-496c-a660-c484d574fb63" (UID: "8aebf63f-b8d3-496c-a660-c484d574fb63"). InnerVolumeSpecName "kube-api-access-mz22j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.007361 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.007395 4883 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.094890 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util" (OuterVolumeSpecName: "util") pod "8aebf63f-b8d3-496c-a660-c484d574fb63" (UID: "8aebf63f-b8d3-496c-a660-c484d574fb63"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.108064 4883 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.593051 4883 generic.go:334] "Generic (PLEG): container finished" podID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerID="8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3" exitCode=0 Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.593147 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerDied","Data":"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3"} Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.598718 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerDied","Data":"09e5e9966c9e77e8871e74d42d8fc0d89cf7a2a605c521a49f9476f6877182ee"} Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.598785 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e5e9966c9e77e8871e74d42d8fc0d89cf7a2a605c521a49f9476f6877182ee" Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.598815 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:44 crc kubenswrapper[4883]: I0310 09:17:44.605663 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerStarted","Data":"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a"} Mar 10 09:17:44 crc kubenswrapper[4883]: I0310 09:17:44.629318 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2mq4z" podStartSLOduration=2.146433301 podStartE2EDuration="4.629302049s" podCreationTimestamp="2026-03-10 09:17:40 +0000 UTC" firstStartedPulling="2026-03-10 09:17:41.566214666 +0000 UTC m=+847.821112555" lastFinishedPulling="2026-03-10 09:17:44.049083414 +0000 UTC m=+850.303981303" observedRunningTime="2026-03-10 09:17:44.626264931 +0000 UTC m=+850.881162821" watchObservedRunningTime="2026-03-10 09:17:44.629302049 +0000 UTC m=+850.884199937" Mar 10 09:17:50 crc kubenswrapper[4883]: I0310 09:17:50.749432 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:50 crc kubenswrapper[4883]: I0310 09:17:50.749797 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:50 crc kubenswrapper[4883]: I0310 09:17:50.784440 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:51 crc kubenswrapper[4883]: I0310 09:17:51.697438 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418234 4883 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"] Mar 10 09:17:52 crc kubenswrapper[4883]: E0310 09:17:52.418443 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418455 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console" Mar 10 09:17:52 crc kubenswrapper[4883]: E0310 09:17:52.418511 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="pull" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418518 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="pull" Mar 10 09:17:52 crc kubenswrapper[4883]: E0310 09:17:52.418530 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="extract" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418546 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="extract" Mar 10 09:17:52 crc kubenswrapper[4883]: E0310 09:17:52.418555 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="util" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418560 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="util" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418676 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418691 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="extract" Mar 10 09:17:52 crc 
kubenswrapper[4883]: I0310 09:17:52.419046 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.420382 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.421201 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.421364 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.421516 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.421943 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5p6f5" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.433171 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"] Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.436105 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84c69\" (UniqueName: \"kubernetes.io/projected/5804aa0d-ee19-4fb3-bd39-27c7103571d8-kube-api-access-84c69\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.436190 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-webhook-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.436225 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-apiservice-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.537523 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-webhook-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.537587 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-apiservice-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.537656 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84c69\" (UniqueName: \"kubernetes.io/projected/5804aa0d-ee19-4fb3-bd39-27c7103571d8-kube-api-access-84c69\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " 
pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.544385 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-webhook-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.544409 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-apiservice-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.554121 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84c69\" (UniqueName: \"kubernetes.io/projected/5804aa0d-ee19-4fb3-bd39-27c7103571d8-kube-api-access-84c69\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.734387 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.870615 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"] Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.871593 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.875502 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.875983 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5zzbs" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.876198 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.882951 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"] Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.952348 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-webhook-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.952391 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-apiservice-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.952609 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n875\" (UniqueName: 
\"kubernetes.io/projected/cb05036e-52f2-48ab-ba84-f89c4565a0af-kube-api-access-9n875\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.031923 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"] Mar 10 09:17:53 crc kubenswrapper[4883]: W0310 09:17:53.042551 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5804aa0d_ee19_4fb3_bd39_27c7103571d8.slice/crio-69e140fc9679dd9f8ec82b71ac27155130806adf53b04c436f242093932e69a7 WatchSource:0}: Error finding container 69e140fc9679dd9f8ec82b71ac27155130806adf53b04c436f242093932e69a7: Status 404 returned error can't find the container with id 69e140fc9679dd9f8ec82b71ac27155130806adf53b04c436f242093932e69a7 Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.053608 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n875\" (UniqueName: \"kubernetes.io/projected/cb05036e-52f2-48ab-ba84-f89c4565a0af-kube-api-access-9n875\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.053708 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-webhook-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.053737 4883 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-apiservice-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.058094 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-apiservice-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.060383 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-webhook-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.069367 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n875\" (UniqueName: \"kubernetes.io/projected/cb05036e-52f2-48ab-ba84-f89c4565a0af-kube-api-access-9n875\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.190377 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.576378 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"] Mar 10 09:17:53 crc kubenswrapper[4883]: W0310 09:17:53.583917 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb05036e_52f2_48ab_ba84_f89c4565a0af.slice/crio-928c868bf90255745b261f14c3e93fa385f36d5abf5f54e50f9dfcdb77283a4b WatchSource:0}: Error finding container 928c868bf90255745b261f14c3e93fa385f36d5abf5f54e50f9dfcdb77283a4b: Status 404 returned error can't find the container with id 928c868bf90255745b261f14c3e93fa385f36d5abf5f54e50f9dfcdb77283a4b Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.677341 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" event={"ID":"cb05036e-52f2-48ab-ba84-f89c4565a0af","Type":"ContainerStarted","Data":"928c868bf90255745b261f14c3e93fa385f36d5abf5f54e50f9dfcdb77283a4b"} Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.678681 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" event={"ID":"5804aa0d-ee19-4fb3-bd39-27c7103571d8","Type":"ContainerStarted","Data":"69e140fc9679dd9f8ec82b71ac27155130806adf53b04c436f242093932e69a7"} Mar 10 09:17:54 crc kubenswrapper[4883]: I0310 09:17:54.830339 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"] Mar 10 09:17:54 crc kubenswrapper[4883]: I0310 09:17:54.831101 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2mq4z" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="registry-server" 
containerID="cri-o://97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" gracePeriod=2 Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.264291 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.283096 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") pod \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.283144 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") pod \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.283164 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") pod \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.288345 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities" (OuterVolumeSpecName: "utilities") pod "8c5e84e5-8671-4388-a92e-6ce1ecab3f48" (UID: "8c5e84e5-8671-4388-a92e-6ce1ecab3f48"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.290340 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn" (OuterVolumeSpecName: "kube-api-access-t5bmn") pod "8c5e84e5-8671-4388-a92e-6ce1ecab3f48" (UID: "8c5e84e5-8671-4388-a92e-6ce1ecab3f48"). InnerVolumeSpecName "kube-api-access-t5bmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.383988 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.384228 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.397000 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c5e84e5-8671-4388-a92e-6ce1ecab3f48" (UID: "8c5e84e5-8671-4388-a92e-6ce1ecab3f48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.484911 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698584 4883 generic.go:334] "Generic (PLEG): container finished" podID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerID="97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" exitCode=0 Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698640 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerDied","Data":"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a"} Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698670 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698715 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerDied","Data":"5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45"} Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698738 4883 scope.go:117] "RemoveContainer" containerID="97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.727880 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"] Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.731513 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"] Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.963832 4883 scope.go:117] "RemoveContainer" containerID="8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.995393 4883 scope.go:117] "RemoveContainer" containerID="9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.011308 4883 scope.go:117] "RemoveContainer" containerID="97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" Mar 10 09:17:56 crc kubenswrapper[4883]: E0310 09:17:56.011712 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a\": container with ID starting with 97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a not found: ID does not exist" containerID="97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.011748 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a"} err="failed to get container status \"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a\": rpc error: code = NotFound desc = could not find container \"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a\": container with ID starting with 97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a not found: ID does not exist" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.011773 4883 scope.go:117] "RemoveContainer" containerID="8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3" Mar 10 09:17:56 crc kubenswrapper[4883]: E0310 09:17:56.012037 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3\": container with ID starting with 8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3 not found: ID does not exist" containerID="8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.012063 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3"} err="failed to get container status \"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3\": rpc error: code = NotFound desc = could not find container \"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3\": container with ID starting with 8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3 not found: ID does not exist" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.012078 4883 scope.go:117] "RemoveContainer" containerID="9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99" Mar 10 09:17:56 crc kubenswrapper[4883]: E0310 
09:17:56.012323 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99\": container with ID starting with 9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99 not found: ID does not exist" containerID="9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.012344 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99"} err="failed to get container status \"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99\": rpc error: code = NotFound desc = could not find container \"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99\": container with ID starting with 9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99 not found: ID does not exist" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.093573 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" path="/var/lib/kubelet/pods/8c5e84e5-8671-4388-a92e-6ce1ecab3f48/volumes" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.709430 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" event={"ID":"5804aa0d-ee19-4fb3-bd39-27c7103571d8","Type":"ContainerStarted","Data":"c458dd6596c0b4b74519f686fff3a176d4ad08160f8d0443a5d01e94318379ea"} Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.709762 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.730352 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" podStartSLOduration=1.777283781 podStartE2EDuration="4.730326453s" podCreationTimestamp="2026-03-10 09:17:52 +0000 UTC" firstStartedPulling="2026-03-10 09:17:53.045565597 +0000 UTC m=+859.300463487" lastFinishedPulling="2026-03-10 09:17:55.99860827 +0000 UTC m=+862.253506159" observedRunningTime="2026-03-10 09:17:56.726177198 +0000 UTC m=+862.981075087" watchObservedRunningTime="2026-03-10 09:17:56.730326453 +0000 UTC m=+862.985224341" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.032607 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xrpkc"] Mar 10 09:17:57 crc kubenswrapper[4883]: E0310 09:17:57.032872 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="registry-server" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.032886 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="registry-server" Mar 10 09:17:57 crc kubenswrapper[4883]: E0310 09:17:57.032904 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="extract-content" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.032910 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="extract-content" Mar 10 09:17:57 crc kubenswrapper[4883]: E0310 09:17:57.032917 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="extract-utilities" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.032923 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="extract-utilities" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.033039 4883 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="registry-server" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.034964 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.044563 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"] Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.114367 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.114455 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.114517 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.215588 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") pod \"community-operators-xrpkc\" (UID: 
\"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.215673 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.215724 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.216093 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.216163 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.275444 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") pod \"community-operators-xrpkc\" (UID: 
\"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.357117 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.271572 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"] Mar 10 09:17:58 crc kubenswrapper[4883]: W0310 09:17:58.278377 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62f58f16_8f76_44eb_8788_eb8664952511.slice/crio-9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5 WatchSource:0}: Error finding container 9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5: Status 404 returned error can't find the container with id 9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5 Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.725219 4883 generic.go:334] "Generic (PLEG): container finished" podID="62f58f16-8f76-44eb-8788-eb8664952511" containerID="bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3" exitCode=0 Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.725327 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerDied","Data":"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3"} Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.725400 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerStarted","Data":"9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5"} Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.728964 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" event={"ID":"cb05036e-52f2-48ab-ba84-f89c4565a0af","Type":"ContainerStarted","Data":"dc94503e5bf9ef7204b9cec124249434bddc81760853cf33b25fd7933cec8722"} Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.729268 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.764130 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" podStartSLOduration=2.43585861 podStartE2EDuration="6.764107853s" podCreationTimestamp="2026-03-10 09:17:52 +0000 UTC" firstStartedPulling="2026-03-10 09:17:53.587088099 +0000 UTC m=+859.841985988" lastFinishedPulling="2026-03-10 09:17:57.915337343 +0000 UTC m=+864.170235231" observedRunningTime="2026-03-10 09:17:58.759255151 +0000 UTC m=+865.014153041" watchObservedRunningTime="2026-03-10 09:17:58.764107853 +0000 UTC m=+865.019005742" Mar 10 09:17:59 crc kubenswrapper[4883]: I0310 09:17:59.736501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerStarted","Data":"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f"} Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.129316 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"] Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.130060 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.131560 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.131938 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.135507 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.137158 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"] Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.155275 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") pod \"auto-csr-approver-29552238-qbbs2\" (UID: \"38171111-f624-438d-ba5a-36f6b9cb29bf\") " pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.255916 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") pod \"auto-csr-approver-29552238-qbbs2\" (UID: \"38171111-f624-438d-ba5a-36f6b9cb29bf\") " pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.272240 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") pod \"auto-csr-approver-29552238-qbbs2\" (UID: \"38171111-f624-438d-ba5a-36f6b9cb29bf\") " 
pod="openshift-infra/auto-csr-approver-29552238-qbbs2"
Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.441985 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-qbbs2"
Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.632257 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"]
Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.743015 4883 generic.go:334] "Generic (PLEG): container finished" podID="62f58f16-8f76-44eb-8788-eb8664952511" containerID="024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f" exitCode=0
Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.743108 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerDied","Data":"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f"}
Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.744227 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" event={"ID":"38171111-f624-438d-ba5a-36f6b9cb29bf","Type":"ContainerStarted","Data":"7b7bc77b0a8c57d3cf6064703d8a10ff9c4c2187f99d6a1887470fc740fe96c1"}
Mar 10 09:18:01 crc kubenswrapper[4883]: I0310 09:18:01.752212 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerStarted","Data":"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0"}
Mar 10 09:18:01 crc kubenswrapper[4883]: I0310 09:18:01.777520 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xrpkc" podStartSLOduration=2.264522033 podStartE2EDuration="4.777506423s" podCreationTimestamp="2026-03-10 09:17:57 +0000 UTC" firstStartedPulling="2026-03-10 09:17:58.727347801 +0000 UTC m=+864.982245690" lastFinishedPulling="2026-03-10 09:18:01.240332191 +0000 UTC m=+867.495230080" observedRunningTime="2026-03-10 09:18:01.77371319 +0000 UTC m=+868.028611069" watchObservedRunningTime="2026-03-10 09:18:01.777506423 +0000 UTC m=+868.032404313"
Mar 10 09:18:02 crc kubenswrapper[4883]: I0310 09:18:02.761760 4883 generic.go:334] "Generic (PLEG): container finished" podID="38171111-f624-438d-ba5a-36f6b9cb29bf" containerID="e20d3f6d5f3aae231c536075cd1098cf482fcd5c0cc1095b975e4d04ba285b0b" exitCode=0
Mar 10 09:18:02 crc kubenswrapper[4883]: I0310 09:18:02.761811 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" event={"ID":"38171111-f624-438d-ba5a-36f6b9cb29bf","Type":"ContainerDied","Data":"e20d3f6d5f3aae231c536075cd1098cf482fcd5c0cc1095b975e4d04ba285b0b"}
Mar 10 09:18:03 crc kubenswrapper[4883]: I0310 09:18:03.983100 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-qbbs2"
Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.002843 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") pod \"38171111-f624-438d-ba5a-36f6b9cb29bf\" (UID: \"38171111-f624-438d-ba5a-36f6b9cb29bf\") "
Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.008638 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9" (OuterVolumeSpecName: "kube-api-access-7s6d9") pod "38171111-f624-438d-ba5a-36f6b9cb29bf" (UID: "38171111-f624-438d-ba5a-36f6b9cb29bf"). InnerVolumeSpecName "kube-api-access-7s6d9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.104895 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") on node \"crc\" DevicePath \"\""
Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.793047 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" event={"ID":"38171111-f624-438d-ba5a-36f6b9cb29bf","Type":"ContainerDied","Data":"7b7bc77b0a8c57d3cf6064703d8a10ff9c4c2187f99d6a1887470fc740fe96c1"}
Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.793103 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b7bc77b0a8c57d3cf6064703d8a10ff9c4c2187f99d6a1887470fc740fe96c1"
Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.793121 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-qbbs2"
Mar 10 09:18:05 crc kubenswrapper[4883]: I0310 09:18:05.032282 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"]
Mar 10 09:18:05 crc kubenswrapper[4883]: I0310 09:18:05.036872 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"]
Mar 10 09:18:06 crc kubenswrapper[4883]: I0310 09:18:06.085272 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" path="/var/lib/kubelet/pods/bbf9a36e-c0e2-4943-a87c-9f6735b2714e/volumes"
Mar 10 09:18:07 crc kubenswrapper[4883]: I0310 09:18:07.358364 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xrpkc"
Mar 10 09:18:07 crc kubenswrapper[4883]: I0310 09:18:07.358435 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xrpkc"
Mar 10 09:18:07 crc kubenswrapper[4883]: I0310 09:18:07.398300 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xrpkc"
Mar 10 09:18:07 crc kubenswrapper[4883]: I0310 09:18:07.849796 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xrpkc"
Mar 10 09:18:09 crc kubenswrapper[4883]: I0310 09:18:09.628049 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"]
Mar 10 09:18:09 crc kubenswrapper[4883]: I0310 09:18:09.824672 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xrpkc" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="registry-server" containerID="cri-o://ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0" gracePeriod=2
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.165559 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrpkc"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.285098 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") pod \"62f58f16-8f76-44eb-8788-eb8664952511\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") "
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.285279 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") pod \"62f58f16-8f76-44eb-8788-eb8664952511\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") "
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.285429 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") pod \"62f58f16-8f76-44eb-8788-eb8664952511\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") "
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.286186 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities" (OuterVolumeSpecName: "utilities") pod "62f58f16-8f76-44eb-8788-eb8664952511" (UID: "62f58f16-8f76-44eb-8788-eb8664952511"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.291799 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d" (OuterVolumeSpecName: "kube-api-access-xb84d") pod "62f58f16-8f76-44eb-8788-eb8664952511" (UID: "62f58f16-8f76-44eb-8788-eb8664952511"). InnerVolumeSpecName "kube-api-access-xb84d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.328266 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62f58f16-8f76-44eb-8788-eb8664952511" (UID: "62f58f16-8f76-44eb-8788-eb8664952511"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.386761 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.386790 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.386803 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") on node \"crc\" DevicePath \"\""
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.833990 4883 generic.go:334] "Generic (PLEG): container finished" podID="62f58f16-8f76-44eb-8788-eb8664952511" containerID="ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0" exitCode=0
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.834050 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerDied","Data":"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0"}
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.834067 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrpkc"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.834115 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerDied","Data":"9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5"}
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.834159 4883 scope.go:117] "RemoveContainer" containerID="ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.857960 4883 scope.go:117] "RemoveContainer" containerID="024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.859660 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"]
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.863358 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"]
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.877494 4883 scope.go:117] "RemoveContainer" containerID="bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890208 4883 scope.go:117] "RemoveContainer" containerID="ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0"
Mar 10 09:18:10 crc kubenswrapper[4883]: E0310 09:18:10.890487 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0\": container with ID starting with ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0 not found: ID does not exist" containerID="ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890514 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0"} err="failed to get container status \"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0\": rpc error: code = NotFound desc = could not find container \"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0\": container with ID starting with ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0 not found: ID does not exist"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890536 4883 scope.go:117] "RemoveContainer" containerID="024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f"
Mar 10 09:18:10 crc kubenswrapper[4883]: E0310 09:18:10.890818 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f\": container with ID starting with 024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f not found: ID does not exist" containerID="024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890860 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f"} err="failed to get container status \"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f\": rpc error: code = NotFound desc = could not find container \"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f\": container with ID starting with 024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f not found: ID does not exist"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890890 4883 scope.go:117] "RemoveContainer" containerID="bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3"
Mar 10 09:18:10 crc kubenswrapper[4883]: E0310 09:18:10.891126 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3\": container with ID starting with bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3 not found: ID does not exist" containerID="bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3"
Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.891163 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3"} err="failed to get container status \"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3\": rpc error: code = NotFound desc = could not find container \"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3\": container with ID starting with bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3 not found: ID does not exist"
Mar 10 09:18:12 crc kubenswrapper[4883]: I0310 09:18:12.087200 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f58f16-8f76-44eb-8788-eb8664952511" path="/var/lib/kubelet/pods/62f58f16-8f76-44eb-8788-eb8664952511/volumes"
Mar 10 09:18:13 crc kubenswrapper[4883]: I0310 09:18:13.195179 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"
Mar 10 09:18:32 crc kubenswrapper[4883]: I0310 09:18:32.738870 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.280513 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ck8gb"]
Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.280905 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="registry-server"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.280927 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="registry-server"
Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.280955 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="extract-utilities"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.280962 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="extract-utilities"
Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.280971 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="extract-content"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.280986 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="extract-content"
Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.280996 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38171111-f624-438d-ba5a-36f6b9cb29bf" containerName="oc"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.281001 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="38171111-f624-438d-ba5a-36f6b9cb29bf" containerName="oc"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.281138 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="registry-server"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.281150 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="38171111-f624-438d-ba5a-36f6b9cb29bf" containerName="oc"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.283067 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.284703 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.285193 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.285506 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gf92g"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.288618 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"]
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.292156 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.301513 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.309959 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"]
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.361852 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gtqfn"]
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.363128 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.371248 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-rtrbh"]
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.373267 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zwvch"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.373444 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-rtrbh"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.373890 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.374587 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.375388 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377410 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-conf\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377454 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377501 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics-certs\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377525 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-sockets\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377548 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnz6k\" (UniqueName: \"kubernetes.io/projected/8e843a56-715a-44fc-9974-8570d49bd9a0-kube-api-access-mnz6k\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377568 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-reloader\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377619 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfn9k\" (UniqueName: \"kubernetes.io/projected/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-kube-api-access-xfn9k\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377673 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqm8\" (UniqueName: \"kubernetes.io/projected/d2caf019-bd64-4a5c-bf88-c260178bdc82-kube-api-access-xlqm8\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377688 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metallb-excludel2\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377702 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-startup\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377729 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377761 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e843a56-715a-44fc-9974-8570d49bd9a0-cert\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377784 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metrics-certs\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.380177 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-rtrbh"]
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.384219 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479117 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqm8\" (UniqueName: \"kubernetes.io/projected/d2caf019-bd64-4a5c-bf88-c260178bdc82-kube-api-access-xlqm8\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metallb-excludel2\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-startup\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479854 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d4cl\" (UniqueName: \"kubernetes.io/projected/59437559-8b42-4779-8b72-17f09b50b572-kube-api-access-9d4cl\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479928 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480047 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e843a56-715a-44fc-9974-8570d49bd9a0-cert\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480119 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metrics-certs\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480153 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-cert\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480270 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metallb-excludel2\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480289 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-conf\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480357 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480396 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-metrics-certs\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480431 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics-certs\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480455 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-sockets\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480505 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnz6k\" (UniqueName: \"kubernetes.io/projected/8e843a56-715a-44fc-9974-8570d49bd9a0-kube-api-access-mnz6k\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480532 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-reloader\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480565 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfn9k\" (UniqueName: \"kubernetes.io/projected/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-kube-api-access-xfn9k\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480778 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.481030 4883 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.481223 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist podName:6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf nodeName:}" failed. No retries permitted until 2026-03-10 09:18:33.981186958 +0000 UTC m=+900.236084847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist") pod "speaker-gtqfn" (UID: "6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf") : secret "metallb-memberlist" not found
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.481682 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-startup\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.481980 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-reloader\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.482120 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-conf\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.482237 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-sockets\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.489235 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics-certs\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.489261 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e843a56-715a-44fc-9974-8570d49bd9a0-cert\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.491409 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metrics-certs\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.498023 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnz6k\" (UniqueName: \"kubernetes.io/projected/8e843a56-715a-44fc-9974-8570d49bd9a0-kube-api-access-mnz6k\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.498463 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfn9k\" (UniqueName: \"kubernetes.io/projected/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-kube-api-access-xfn9k\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.498813 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqm8\" (UniqueName: \"kubernetes.io/projected/d2caf019-bd64-4a5c-bf88-c260178bdc82-kube-api-access-xlqm8\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.582094 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-metrics-certs\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.582247 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d4cl\" (UniqueName: \"kubernetes.io/projected/59437559-8b42-4779-8b72-17f09b50b572-kube-api-access-9d4cl\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.582299 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-cert\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.585342 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-metrics-certs\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.587822 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-cert\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh"
Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.606043 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d4cl\" (UniqueName:
\"kubernetes.io/projected/59437559-8b42-4779-8b72-17f09b50b572-kube-api-access-9d4cl\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.619892 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.626692 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.707762 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.987886 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.988094 4883 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.988405 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist podName:6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf nodeName:}" failed. No retries permitted until 2026-03-10 09:18:34.988385321 +0000 UTC m=+901.243283210 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist") pod "speaker-gtqfn" (UID: "6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf") : secret "metallb-memberlist" not found Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.993789 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"c48e6b51db803f7bf091f5c7ab3f4f87e4331cc6ecf296ba1a006b55e6a37b92"} Mar 10 09:18:34 crc kubenswrapper[4883]: W0310 09:18:34.017352 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e843a56_715a_44fc_9974_8570d49bd9a0.slice/crio-5d70e004bba8bc3d5af01c37fa4408ae1cc4b602a74915aeac0b4e410587c372 WatchSource:0}: Error finding container 5d70e004bba8bc3d5af01c37fa4408ae1cc4b602a74915aeac0b4e410587c372: Status 404 returned error can't find the container with id 5d70e004bba8bc3d5af01c37fa4408ae1cc4b602a74915aeac0b4e410587c372 Mar 10 09:18:34 crc kubenswrapper[4883]: I0310 09:18:34.017445 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"] Mar 10 09:18:34 crc kubenswrapper[4883]: I0310 09:18:34.077936 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-rtrbh"] Mar 10 09:18:34 crc kubenswrapper[4883]: I0310 09:18:34.466601 4883 scope.go:117] "RemoveContainer" containerID="49c1aa583870be3098bda47d15de71e40f64a8b97a906132b01f7c81a5eefc00" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.010006 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:35 crc 
kubenswrapper[4883]: I0310 09:18:35.015834 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.018606 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-rtrbh" event={"ID":"59437559-8b42-4779-8b72-17f09b50b572","Type":"ContainerStarted","Data":"880991a17c808314c7aabf3da988d5cf29fbb2411c907c62cb9d1b158e7b8aa2"} Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.018680 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-rtrbh" event={"ID":"59437559-8b42-4779-8b72-17f09b50b572","Type":"ContainerStarted","Data":"5e81dfe5f80957735dc460558b44fdd2d2817afe61eb3882c49e0cf69087285e"} Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.018694 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-rtrbh" event={"ID":"59437559-8b42-4779-8b72-17f09b50b572","Type":"ContainerStarted","Data":"5adb675d9fa9826ad58302a2eee32aa7abd1fd3c6044f16c288b588b3fb43b8c"} Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.019515 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.023148 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" event={"ID":"8e843a56-715a-44fc-9974-8570d49bd9a0","Type":"ContainerStarted","Data":"5d70e004bba8bc3d5af01c37fa4408ae1cc4b602a74915aeac0b4e410587c372"} Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.036676 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-rtrbh" 
podStartSLOduration=2.036665355 podStartE2EDuration="2.036665355s" podCreationTimestamp="2026-03-10 09:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:18:35.035760249 +0000 UTC m=+901.290658138" watchObservedRunningTime="2026-03-10 09:18:35.036665355 +0000 UTC m=+901.291563245" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.204563 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zwvch" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.213431 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gtqfn" Mar 10 09:18:35 crc kubenswrapper[4883]: W0310 09:18:35.248120 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cecc1fd_5f20_4aff_ae03_570ef8b7dfaf.slice/crio-43ecbda99a36af83821fc853a0701d815e65fb7ee314d8d4ec08d6c8c4052e20 WatchSource:0}: Error finding container 43ecbda99a36af83821fc853a0701d815e65fb7ee314d8d4ec08d6c8c4052e20: Status 404 returned error can't find the container with id 43ecbda99a36af83821fc853a0701d815e65fb7ee314d8d4ec08d6c8c4052e20 Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.047019 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gtqfn" event={"ID":"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf","Type":"ContainerStarted","Data":"915837ce4d089c07163aea4aab7a71e679e1e6a2e7149eea8ef7b499aa944444"} Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.047373 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gtqfn" event={"ID":"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf","Type":"ContainerStarted","Data":"db90983d279711dc94a09992a5c9c9008fa364bc66e5e96b72d128f04ebb263a"} Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.047386 4883 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/speaker-gtqfn" event={"ID":"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf","Type":"ContainerStarted","Data":"43ecbda99a36af83821fc853a0701d815e65fb7ee314d8d4ec08d6c8c4052e20"} Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.047657 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gtqfn" Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.071321 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gtqfn" podStartSLOduration=3.071300969 podStartE2EDuration="3.071300969s" podCreationTimestamp="2026-03-10 09:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:18:36.066756168 +0000 UTC m=+902.321654058" watchObservedRunningTime="2026-03-10 09:18:36.071300969 +0000 UTC m=+902.326198858" Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.081569 4883 generic.go:334] "Generic (PLEG): container finished" podID="d2caf019-bd64-4a5c-bf88-c260178bdc82" containerID="803db553b8a1019048be01f57234e2f5a8121ba433f9f74d82658dbd3059eb65" exitCode=0 Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.081628 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerDied","Data":"803db553b8a1019048be01f57234e2f5a8121ba433f9f74d82658dbd3059eb65"} Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.084501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" event={"ID":"8e843a56-715a-44fc-9974-8570d49bd9a0","Type":"ContainerStarted","Data":"b469acbbf38a4f61cf3ab9a2a32dcb7fc38e2a8cc18da5844231e0ff0a1c4ce8"} Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.084680 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" 
Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.123616 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" podStartSLOduration=1.920466868 podStartE2EDuration="8.123591205s" podCreationTimestamp="2026-03-10 09:18:33 +0000 UTC" firstStartedPulling="2026-03-10 09:18:34.020150223 +0000 UTC m=+900.275048113" lastFinishedPulling="2026-03-10 09:18:40.22327456 +0000 UTC m=+906.478172450" observedRunningTime="2026-03-10 09:18:41.122157352 +0000 UTC m=+907.377055241" watchObservedRunningTime="2026-03-10 09:18:41.123591205 +0000 UTC m=+907.378489094" Mar 10 09:18:42 crc kubenswrapper[4883]: I0310 09:18:42.091021 4883 generic.go:334] "Generic (PLEG): container finished" podID="d2caf019-bd64-4a5c-bf88-c260178bdc82" containerID="a724b4bc1fe6b2a28586bc3ff26b541d7426f05ec41a110b3a474b8b43217bd4" exitCode=0 Mar 10 09:18:42 crc kubenswrapper[4883]: I0310 09:18:42.091659 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerDied","Data":"a724b4bc1fe6b2a28586bc3ff26b541d7426f05ec41a110b3a474b8b43217bd4"} Mar 10 09:18:42 crc kubenswrapper[4883]: E0310 09:18:42.335931 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2caf019_bd64_4a5c_bf88_c260178bdc82.slice/crio-conmon-c863c467dc5d4ae5a92ff1c179bf60e2a469d4dac4bea0c06dd960d93e869ee9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2caf019_bd64_4a5c_bf88_c260178bdc82.slice/crio-c863c467dc5d4ae5a92ff1c179bf60e2a469d4dac4bea0c06dd960d93e869ee9.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:18:43 crc kubenswrapper[4883]: I0310 09:18:43.099183 4883 generic.go:334] "Generic (PLEG): container finished" 
podID="d2caf019-bd64-4a5c-bf88-c260178bdc82" containerID="c863c467dc5d4ae5a92ff1c179bf60e2a469d4dac4bea0c06dd960d93e869ee9" exitCode=0 Mar 10 09:18:43 crc kubenswrapper[4883]: I0310 09:18:43.099236 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerDied","Data":"c863c467dc5d4ae5a92ff1c179bf60e2a469d4dac4bea0c06dd960d93e869ee9"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111454 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"483735aae593884e5e74863fd2a40b4e3edc5f0768d90ee57ce189fb79e5b8bf"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111923 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111940 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"38e8b9c85c7171175477d63def63eaa96b2ed4bbaf562fc6287f3913921c4394"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111953 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"d00d94411ea0996a0950a0a12d64c1b22ff349a21906a38b282821b2e3f64c92"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111967 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"94987766b545c56642566b6b8d244dc8407c123659832a2e0d83159820af6c2d"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111977 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" 
event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"235aa8497009432dfeac99cbc95f7cec78c395b098464539eb1d760ca9a6ae42"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111988 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"4e9d98020cc12ea7ff9b05eedf75c4d4bf6e2462e9b351bd3f30d3e8a8699c67"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.138872 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ck8gb" podStartSLOduration=4.660067311 podStartE2EDuration="11.138852348s" podCreationTimestamp="2026-03-10 09:18:33 +0000 UTC" firstStartedPulling="2026-03-10 09:18:33.741612368 +0000 UTC m=+899.996510256" lastFinishedPulling="2026-03-10 09:18:40.220397405 +0000 UTC m=+906.475295293" observedRunningTime="2026-03-10 09:18:44.135600506 +0000 UTC m=+910.390498395" watchObservedRunningTime="2026-03-10 09:18:44.138852348 +0000 UTC m=+910.393750237" Mar 10 09:18:45 crc kubenswrapper[4883]: I0310 09:18:45.218630 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gtqfn" Mar 10 09:18:48 crc kubenswrapper[4883]: I0310 09:18:48.621563 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:48 crc kubenswrapper[4883]: I0310 09:18:48.652355 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.633269 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.634299 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.636822 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fk56b" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.636829 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.640162 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.641007 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.652283 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") pod \"openstack-operator-index-kpm7g\" (UID: \"245d059e-ae23-4152-a123-75424f7694e8\") " pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.754185 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") pod \"openstack-operator-index-kpm7g\" (UID: \"245d059e-ae23-4152-a123-75424f7694e8\") " pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.774532 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") pod \"openstack-operator-index-kpm7g\" (UID: 
\"245d059e-ae23-4152-a123-75424f7694e8\") " pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.952745 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:51 crc kubenswrapper[4883]: I0310 09:18:51.124004 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:51 crc kubenswrapper[4883]: W0310 09:18:51.128155 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod245d059e_ae23_4152_a123_75424f7694e8.slice/crio-e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68 WatchSource:0}: Error finding container e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68: Status 404 returned error can't find the container with id e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68 Mar 10 09:18:51 crc kubenswrapper[4883]: I0310 09:18:51.152821 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpm7g" event={"ID":"245d059e-ae23-4152-a123-75424f7694e8","Type":"ContainerStarted","Data":"e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68"} Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.168219 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpm7g" event={"ID":"245d059e-ae23-4152-a123-75424f7694e8","Type":"ContainerStarted","Data":"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528"} Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.180695 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kpm7g" podStartSLOduration=1.984774228 podStartE2EDuration="3.180679043s" podCreationTimestamp="2026-03-10 09:18:50 +0000 UTC" 
firstStartedPulling="2026-03-10 09:18:51.13088212 +0000 UTC m=+917.385780009" lastFinishedPulling="2026-03-10 09:18:52.326786935 +0000 UTC m=+918.581684824" observedRunningTime="2026-03-10 09:18:53.1787581 +0000 UTC m=+919.433655979" watchObservedRunningTime="2026-03-10 09:18:53.180679043 +0000 UTC m=+919.435576921" Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.642726 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.644152 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.712120 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:55 crc kubenswrapper[4883]: I0310 09:18:55.826080 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:55 crc kubenswrapper[4883]: I0310 09:18:55.826745 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-kpm7g" podUID="245d059e-ae23-4152-a123-75424f7694e8" containerName="registry-server" containerID="cri-o://5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" gracePeriod=2 Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.185315 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187504 4883 generic.go:334] "Generic (PLEG): container finished" podID="245d059e-ae23-4152-a123-75424f7694e8" containerID="5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" exitCode=0 Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187590 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187590 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpm7g" event={"ID":"245d059e-ae23-4152-a123-75424f7694e8","Type":"ContainerDied","Data":"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528"} Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpm7g" event={"ID":"245d059e-ae23-4152-a123-75424f7694e8","Type":"ContainerDied","Data":"e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68"} Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187777 4883 scope.go:117] "RemoveContainer" containerID="5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.217293 4883 scope.go:117] "RemoveContainer" containerID="5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" Mar 10 09:18:56 crc kubenswrapper[4883]: E0310 09:18:56.220955 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528\": container with ID starting with 5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528 not found: ID does not exist" 
containerID="5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.221000 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528"} err="failed to get container status \"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528\": rpc error: code = NotFound desc = could not find container \"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528\": container with ID starting with 5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528 not found: ID does not exist" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.227163 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") pod \"245d059e-ae23-4152-a123-75424f7694e8\" (UID: \"245d059e-ae23-4152-a123-75424f7694e8\") " Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.241704 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd" (OuterVolumeSpecName: "kube-api-access-vpzqd") pod "245d059e-ae23-4152-a123-75424f7694e8" (UID: "245d059e-ae23-4152-a123-75424f7694e8"). InnerVolumeSpecName "kube-api-access-vpzqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.328489 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") on node \"crc\" DevicePath \"\"" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.434781 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-c4vjl"] Mar 10 09:18:56 crc kubenswrapper[4883]: E0310 09:18:56.435393 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245d059e-ae23-4152-a123-75424f7694e8" containerName="registry-server" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.435484 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="245d059e-ae23-4152-a123-75424f7694e8" containerName="registry-server" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.435706 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="245d059e-ae23-4152-a123-75424f7694e8" containerName="registry-server" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.436532 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.439740 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c4vjl"] Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.509565 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.513027 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.531015 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzk7n\" (UniqueName: \"kubernetes.io/projected/83852eec-509b-4074-b837-4f00d1d07d05-kube-api-access-kzk7n\") pod \"openstack-operator-index-c4vjl\" (UID: \"83852eec-509b-4074-b837-4f00d1d07d05\") " pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.631879 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzk7n\" (UniqueName: \"kubernetes.io/projected/83852eec-509b-4074-b837-4f00d1d07d05-kube-api-access-kzk7n\") pod \"openstack-operator-index-c4vjl\" (UID: \"83852eec-509b-4074-b837-4f00d1d07d05\") " pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.651095 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzk7n\" (UniqueName: \"kubernetes.io/projected/83852eec-509b-4074-b837-4f00d1d07d05-kube-api-access-kzk7n\") pod \"openstack-operator-index-c4vjl\" (UID: \"83852eec-509b-4074-b837-4f00d1d07d05\") " pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.752096 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-c4vjl"
Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.985623 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c4vjl"]
Mar 10 09:18:57 crc kubenswrapper[4883]: I0310 09:18:57.197444 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c4vjl" event={"ID":"83852eec-509b-4074-b837-4f00d1d07d05","Type":"ContainerStarted","Data":"3b6401b1164602617b1615252e67892602892a83a8f048757c13194f434e9286"}
Mar 10 09:18:58 crc kubenswrapper[4883]: I0310 09:18:58.088153 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245d059e-ae23-4152-a123-75424f7694e8" path="/var/lib/kubelet/pods/245d059e-ae23-4152-a123-75424f7694e8/volumes"
Mar 10 09:18:58 crc kubenswrapper[4883]: I0310 09:18:58.207911 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c4vjl" event={"ID":"83852eec-509b-4074-b837-4f00d1d07d05","Type":"ContainerStarted","Data":"61399a1629beea1d008f65c439f95fd1833bac9b7da534b488fe13a2e0f85b97"}
Mar 10 09:18:58 crc kubenswrapper[4883]: I0310 09:18:58.222994 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-c4vjl" podStartSLOduration=1.711004628 podStartE2EDuration="2.222975402s" podCreationTimestamp="2026-03-10 09:18:56 +0000 UTC" firstStartedPulling="2026-03-10 09:18:56.9931872 +0000 UTC m=+923.248085089" lastFinishedPulling="2026-03-10 09:18:57.505157974 +0000 UTC m=+923.760055863" observedRunningTime="2026-03-10 09:18:58.219936 +0000 UTC m=+924.474833890" watchObservedRunningTime="2026-03-10 09:18:58.222975402 +0000 UTC m=+924.477873291"
Mar 10 09:19:06 crc kubenswrapper[4883]: I0310 09:19:06.753005 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-c4vjl"
Mar 10 09:19:06 crc kubenswrapper[4883]: I0310 09:19:06.753651 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-c4vjl"
Mar 10 09:19:06 crc kubenswrapper[4883]: I0310 09:19:06.781000 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-c4vjl"
Mar 10 09:19:07 crc kubenswrapper[4883]: I0310 09:19:07.289060 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-c4vjl"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.065331 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"]
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.066894 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.069520 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vvsfg"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.075397 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"]
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.101324 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.101377 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.101443 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.203617 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.203711 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.203794 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.204567 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.204591 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.224947 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.387037 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.749366 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"]
Mar 10 09:19:10 crc kubenswrapper[4883]: W0310 09:19:10.752948 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8a84a3_2cd3_452c_9e28_5bfa45be11c1.slice/crio-e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143 WatchSource:0}: Error finding container e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143: Status 404 returned error can't find the container with id e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.291442 4883 generic.go:334] "Generic (PLEG): container finished" podID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerID="51f283cd117e09208a4768930914421d5662d0e2b76ae05bd47783685cc54eaf" exitCode=0
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.291522 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerDied","Data":"51f283cd117e09208a4768930914421d5662d0e2b76ae05bd47783685cc54eaf"}
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.291790 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerStarted","Data":"e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143"}
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.292774 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.437658 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-svjzz"]
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.438948 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.445483 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"]
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.520107 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q28dw\" (UniqueName: \"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.520156 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.520345 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.621749 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.621849 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q28dw\" (UniqueName: \"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.621902 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.622247 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.622422 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.638653 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q28dw\" (UniqueName: \"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.753631 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.995594 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"]
Mar 10 09:19:12 crc kubenswrapper[4883]: W0310 09:19:12.015572 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e453b8f_c12a_4f46_9727_af420db90b39.slice/crio-b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29 WatchSource:0}: Error finding container b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29: Status 404 returned error can't find the container with id b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29
Mar 10 09:19:12 crc kubenswrapper[4883]: I0310 09:19:12.299262 4883 generic.go:334] "Generic (PLEG): container finished" podID="7e453b8f-c12a-4f46-9727-af420db90b39" containerID="b2eb0240dcdf5c7c8c770a8f07fe72cb183377dcb0f81a38200cee2f1f8d2464" exitCode=0
Mar 10 09:19:12 crc kubenswrapper[4883]: I0310 09:19:12.299387 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerDied","Data":"b2eb0240dcdf5c7c8c770a8f07fe72cb183377dcb0f81a38200cee2f1f8d2464"}
Mar 10 09:19:12 crc kubenswrapper[4883]: I0310 09:19:12.299589 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerStarted","Data":"b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29"}
Mar 10 09:19:13 crc kubenswrapper[4883]: I0310 09:19:13.311248 4883 generic.go:334] "Generic (PLEG): container finished" podID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerID="443c78311a799b12540fdae003b3a40c61b69151091d181fd46a826a7a5dbc48" exitCode=0
Mar 10 09:19:13 crc kubenswrapper[4883]: I0310 09:19:13.311358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerDied","Data":"443c78311a799b12540fdae003b3a40c61b69151091d181fd46a826a7a5dbc48"}
Mar 10 09:19:13 crc kubenswrapper[4883]: I0310 09:19:13.315293 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerStarted","Data":"10600bc19ee71dbc79e67e3d012f88347a66f894fddbad0aba7be8ddb5c9da60"}
Mar 10 09:19:14 crc kubenswrapper[4883]: I0310 09:19:14.325376 4883 generic.go:334] "Generic (PLEG): container finished" podID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerID="bf7f3352df4eb679b65f63cc131fc1d1e6fbab83b5f535bcabc0431a7fe48488" exitCode=0
Mar 10 09:19:14 crc kubenswrapper[4883]: I0310 09:19:14.325504 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerDied","Data":"bf7f3352df4eb679b65f63cc131fc1d1e6fbab83b5f535bcabc0431a7fe48488"}
Mar 10 09:19:14 crc kubenswrapper[4883]: I0310 09:19:14.327806 4883 generic.go:334] "Generic (PLEG): container finished" podID="7e453b8f-c12a-4f46-9727-af420db90b39" containerID="10600bc19ee71dbc79e67e3d012f88347a66f894fddbad0aba7be8ddb5c9da60" exitCode=0
Mar 10 09:19:14 crc kubenswrapper[4883]: I0310 09:19:14.327845 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerDied","Data":"10600bc19ee71dbc79e67e3d012f88347a66f894fddbad0aba7be8ddb5c9da60"}
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.339008 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerStarted","Data":"dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc"}
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.360706 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-svjzz" podStartSLOduration=1.833153941 podStartE2EDuration="4.360682349s" podCreationTimestamp="2026-03-10 09:19:11 +0000 UTC" firstStartedPulling="2026-03-10 09:19:12.301827657 +0000 UTC m=+938.556725546" lastFinishedPulling="2026-03-10 09:19:14.829356065 +0000 UTC m=+941.084253954" observedRunningTime="2026-03-10 09:19:15.359824271 +0000 UTC m=+941.614722160" watchObservedRunningTime="2026-03-10 09:19:15.360682349 +0000 UTC m=+941.615580238"
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.552131 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.685261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") pod \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") "
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.685334 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") pod \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") "
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.685607 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") pod \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") "
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.686367 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle" (OuterVolumeSpecName: "bundle") pod "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" (UID: "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.691655 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb" (OuterVolumeSpecName: "kube-api-access-7m6fb") pod "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" (UID: "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1"). InnerVolumeSpecName "kube-api-access-7m6fb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.696032 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util" (OuterVolumeSpecName: "util") pod "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" (UID: "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.787165 4883 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.787197 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") on node \"crc\" DevicePath \"\""
Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.787209 4883 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") on node \"crc\" DevicePath \"\""
Mar 10 09:19:16 crc kubenswrapper[4883]: I0310 09:19:16.346538 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"
Mar 10 09:19:16 crc kubenswrapper[4883]: I0310 09:19:16.346885 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerDied","Data":"e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143"}
Mar 10 09:19:16 crc kubenswrapper[4883]: I0310 09:19:16.346915 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143"
Mar 10 09:19:17 crc kubenswrapper[4883]: I0310 09:19:17.449578 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:19:17 crc kubenswrapper[4883]: I0310 09:19:17.449876 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.100155 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"]
Mar 10 09:19:21 crc kubenswrapper[4883]: E0310 09:19:21.101493 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="util"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.101590 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="util"
Mar 10 09:19:21 crc kubenswrapper[4883]: E0310 09:19:21.101646 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="pull"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.101694 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="pull"
Mar 10 09:19:21 crc kubenswrapper[4883]: E0310 09:19:21.101757 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="extract"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.101804 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="extract"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.101980 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="extract"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.102501 4883 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.104341 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-dhgbx"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.118255 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"]
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.163400 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnz2d\" (UniqueName: \"kubernetes.io/projected/31e7ec33-4b44-48ce-9f01-e483a7668dd6-kube-api-access-rnz2d\") pod \"openstack-operator-controller-init-6cf8df7788-tzrb8\" (UID: \"31e7ec33-4b44-48ce-9f01-e483a7668dd6\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.264809 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnz2d\" (UniqueName: \"kubernetes.io/projected/31e7ec33-4b44-48ce-9f01-e483a7668dd6-kube-api-access-rnz2d\") pod \"openstack-operator-controller-init-6cf8df7788-tzrb8\" (UID: \"31e7ec33-4b44-48ce-9f01-e483a7668dd6\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.285774 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnz2d\" (UniqueName: \"kubernetes.io/projected/31e7ec33-4b44-48ce-9f01-e483a7668dd6-kube-api-access-rnz2d\") pod \"openstack-operator-controller-init-6cf8df7788-tzrb8\" (UID: \"31e7ec33-4b44-48ce-9f01-e483a7668dd6\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.416633 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.754542 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.754910 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.888944 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"]
Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.911659 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:22 crc kubenswrapper[4883]: I0310 09:19:22.388291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" event={"ID":"31e7ec33-4b44-48ce-9f01-e483a7668dd6","Type":"ContainerStarted","Data":"89fb5b3a1d23f68f7a9631050a7e369a2662a99ba7164324fe89cf32c693b4e3"}
Mar 10 09:19:22 crc kubenswrapper[4883]: I0310 09:19:22.416617 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.028552 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"]
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.029133 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-svjzz" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="registry-server" containerID="cri-o://dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc" gracePeriod=2
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.412016 4883 generic.go:334] "Generic (PLEG): container finished" podID="7e453b8f-c12a-4f46-9727-af420db90b39" containerID="dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc" exitCode=0
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.412274 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerDied","Data":"dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc"}
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.738418 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.939185 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q28dw\" (UniqueName: \"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") pod \"7e453b8f-c12a-4f46-9727-af420db90b39\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") "
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.939234 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") pod \"7e453b8f-c12a-4f46-9727-af420db90b39\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") "
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.939301 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") pod \"7e453b8f-c12a-4f46-9727-af420db90b39\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") "
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.940251 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities" (OuterVolumeSpecName: "utilities") pod "7e453b8f-c12a-4f46-9727-af420db90b39" (UID: "7e453b8f-c12a-4f46-9727-af420db90b39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.944983 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw" (OuterVolumeSpecName: "kube-api-access-q28dw") pod "7e453b8f-c12a-4f46-9727-af420db90b39" (UID: "7e453b8f-c12a-4f46-9727-af420db90b39"). InnerVolumeSpecName "kube-api-access-q28dw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.977333 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e453b8f-c12a-4f46-9727-af420db90b39" (UID: "7e453b8f-c12a-4f46-9727-af420db90b39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.041219 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q28dw\" (UniqueName: \"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") on node \"crc\" DevicePath \"\""
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.041255 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.041267 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.420634 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerDied","Data":"b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29"}
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.420695 4883 scope.go:117] "RemoveContainer" containerID="dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc"
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.420749 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svjzz"
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.437648 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"]
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.438497 4883 scope.go:117] "RemoveContainer" containerID="10600bc19ee71dbc79e67e3d012f88347a66f894fddbad0aba7be8ddb5c9da60"
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.441171 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"]
Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.467085 4883 scope.go:117] "RemoveContainer" containerID="b2eb0240dcdf5c7c8c770a8f07fe72cb183377dcb0f81a38200cee2f1f8d2464"
Mar 10 09:19:28 crc kubenswrapper[4883]: I0310 09:19:28.087082 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" path="/var/lib/kubelet/pods/7e453b8f-c12a-4f46-9727-af420db90b39/volumes"
Mar 10 09:19:31 crc kubenswrapper[4883]: I0310 09:19:31.453919 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" event={"ID":"31e7ec33-4b44-48ce-9f01-e483a7668dd6","Type":"ContainerStarted","Data":"3f27e1efe85e06d1b494a4ba25ff99d7e3d20a593393b2d56bc8d7ee80921fbb"}
Mar 10 09:19:31 crc kubenswrapper[4883]: I0310 09:19:31.454404 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"
Mar 10 09:19:31 crc kubenswrapper[4883]: I0310 09:19:31.484840 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" podStartSLOduration=1.132157941 podStartE2EDuration="10.484818575s" podCreationTimestamp="2026-03-10 09:19:21 +0000 UTC" firstStartedPulling="2026-03-10 09:19:21.896364131 +0000 UTC m=+948.151262021" lastFinishedPulling="2026-03-10 09:19:31.249024766 +0000 UTC m=+957.503922655" observedRunningTime="2026-03-10 09:19:31.478708764 +0000 UTC m=+957.733606652" watchObservedRunningTime="2026-03-10 09:19:31.484818575 +0000 UTC m=+957.739716464"
Mar 10 09:19:41 crc kubenswrapper[4883]: I0310 09:19:41.420317 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"
Mar 10 09:19:47 crc kubenswrapper[4883]: I0310 09:19:47.449659 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:19:47 crc kubenswrapper[4883]: I0310 09:19:47.451108 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.130784 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"]
Mar 10 09:20:00 crc kubenswrapper[4883]: E0310 09:20:00.131446 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="extract-content"
Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131459 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="extract-content"
Mar 10 09:20:00 crc kubenswrapper[4883]: E0310 09:20:00.131491 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="registry-server"
Mar
10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131498 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4883]: E0310 09:20:00.131515 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="extract-utilities" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131522 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="extract-utilities" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131624 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131997 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.133741 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.134088 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.135803 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"] Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.138468 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.324557 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") pod 
\"auto-csr-approver-29552240-29nz4\" (UID: \"ed80b911-07e4-45b8-9324-dfdf65e5a508\") " pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.426800 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") pod \"auto-csr-approver-29552240-29nz4\" (UID: \"ed80b911-07e4-45b8-9324-dfdf65e5a508\") " pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.446336 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") pod \"auto-csr-approver-29552240-29nz4\" (UID: \"ed80b911-07e4-45b8-9324-dfdf65e5a508\") " pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.446596 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.644115 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"] Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.664374 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-29nz4" event={"ID":"ed80b911-07e4-45b8-9324-dfdf65e5a508","Type":"ContainerStarted","Data":"e00c5f24a8113bedb1556c765ed6f8bd88f0d1b5772cf393aac0e1f09785480a"} Mar 10 09:20:02 crc kubenswrapper[4883]: I0310 09:20:02.680529 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-29nz4" event={"ID":"ed80b911-07e4-45b8-9324-dfdf65e5a508","Type":"ContainerStarted","Data":"7f58830e2474379901c1e5ffc8128426c4b2e0d855c9a6d20d780eb3f2958fa5"} Mar 10 09:20:02 crc kubenswrapper[4883]: I0310 09:20:02.707195 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552240-29nz4" podStartSLOduration=0.970941138 podStartE2EDuration="2.707181897s" podCreationTimestamp="2026-03-10 09:20:00 +0000 UTC" firstStartedPulling="2026-03-10 09:20:00.650129202 +0000 UTC m=+986.905027091" lastFinishedPulling="2026-03-10 09:20:02.386369961 +0000 UTC m=+988.641267850" observedRunningTime="2026-03-10 09:20:02.703788489 +0000 UTC m=+988.958686378" watchObservedRunningTime="2026-03-10 09:20:02.707181897 +0000 UTC m=+988.962079786" Mar 10 09:20:03 crc kubenswrapper[4883]: I0310 09:20:03.691239 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed80b911-07e4-45b8-9324-dfdf65e5a508" containerID="7f58830e2474379901c1e5ffc8128426c4b2e0d855c9a6d20d780eb3f2958fa5" exitCode=0 Mar 10 09:20:03 crc kubenswrapper[4883]: I0310 09:20:03.691347 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-29nz4" 
event={"ID":"ed80b911-07e4-45b8-9324-dfdf65e5a508","Type":"ContainerDied","Data":"7f58830e2474379901c1e5ffc8128426c4b2e0d855c9a6d20d780eb3f2958fa5"} Mar 10 09:20:04 crc kubenswrapper[4883]: I0310 09:20:04.926755 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.089389 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") pod \"ed80b911-07e4-45b8-9324-dfdf65e5a508\" (UID: \"ed80b911-07e4-45b8-9324-dfdf65e5a508\") " Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.095786 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw" (OuterVolumeSpecName: "kube-api-access-c42zw") pod "ed80b911-07e4-45b8-9324-dfdf65e5a508" (UID: "ed80b911-07e4-45b8-9324-dfdf65e5a508"). InnerVolumeSpecName "kube-api-access-c42zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.191177 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.704928 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-29nz4" event={"ID":"ed80b911-07e4-45b8-9324-dfdf65e5a508","Type":"ContainerDied","Data":"e00c5f24a8113bedb1556c765ed6f8bd88f0d1b5772cf393aac0e1f09785480a"} Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.704974 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.704995 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e00c5f24a8113bedb1556c765ed6f8bd88f0d1b5772cf393aac0e1f09785480a" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.747211 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"] Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.753835 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"] Mar 10 09:20:06 crc kubenswrapper[4883]: I0310 09:20:06.087509 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" path="/var/lib/kubelet/pods/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9/volumes" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.101331 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj"] Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.102078 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed80b911-07e4-45b8-9324-dfdf65e5a508" containerName="oc" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.102092 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed80b911-07e4-45b8-9324-dfdf65e5a508" containerName="oc" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.102219 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed80b911-07e4-45b8-9324-dfdf65e5a508" containerName="oc" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.102595 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.106196 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6mtmj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.110613 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.111591 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.113302 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2nbzk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.115596 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.119235 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.120071 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.121451 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hscmq" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.125196 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.134610 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2smxq\" (UniqueName: \"kubernetes.io/projected/9a394c48-31ca-4e99-b210-45ae6f67faaa-kube-api-access-2smxq\") pod \"designate-operator-controller-manager-66d56f6ff4-h2cxw\" (UID: \"9a394c48-31ca-4e99-b210-45ae6f67faaa\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.134663 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kwbw\" (UniqueName: \"kubernetes.io/projected/09a04267-a914-4c55-add8-735a053038d3-kube-api-access-7kwbw\") pod \"cinder-operator-controller-manager-984cd4dcf-nzdsk\" (UID: \"09a04267-a914-4c55-add8-735a053038d3\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.134959 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72bds\" (UniqueName: \"kubernetes.io/projected/ac18771f-5f45-40d8-b275-38e2e1c48ba6-kube-api-access-72bds\") pod \"barbican-operator-controller-manager-677bd678f7-q52nj\" (UID: \"ac18771f-5f45-40d8-b275-38e2e1c48ba6\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 
09:20:08.142814 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.146655 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.147447 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.150421 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-sk8jb" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.163327 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.164019 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.168005 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-7wflq" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.180552 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.189813 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.190583 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.191817 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-q258g" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.193617 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.199629 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.214935 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.221390 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.221936 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.222022 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.225642 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-hzg6c" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.225789 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.225909 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4ld57" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.227616 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.228382 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.237771 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6mfs8" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.239149 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2smxq\" (UniqueName: \"kubernetes.io/projected/9a394c48-31ca-4e99-b210-45ae6f67faaa-kube-api-access-2smxq\") pod \"designate-operator-controller-manager-66d56f6ff4-h2cxw\" (UID: \"9a394c48-31ca-4e99-b210-45ae6f67faaa\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.239209 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kwbw\" (UniqueName: \"kubernetes.io/projected/09a04267-a914-4c55-add8-735a053038d3-kube-api-access-7kwbw\") pod \"cinder-operator-controller-manager-984cd4dcf-nzdsk\" (UID: \"09a04267-a914-4c55-add8-735a053038d3\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.239316 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72bds\" (UniqueName: \"kubernetes.io/projected/ac18771f-5f45-40d8-b275-38e2e1c48ba6-kube-api-access-72bds\") pod \"barbican-operator-controller-manager-677bd678f7-q52nj\" (UID: \"ac18771f-5f45-40d8-b275-38e2e1c48ba6\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.257650 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.263668 4883 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.296067 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.299627 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2smxq\" (UniqueName: \"kubernetes.io/projected/9a394c48-31ca-4e99-b210-45ae6f67faaa-kube-api-access-2smxq\") pod \"designate-operator-controller-manager-66d56f6ff4-h2cxw\" (UID: \"9a394c48-31ca-4e99-b210-45ae6f67faaa\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.326086 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kwbw\" (UniqueName: \"kubernetes.io/projected/09a04267-a914-4c55-add8-735a053038d3-kube-api-access-7kwbw\") pod \"cinder-operator-controller-manager-984cd4dcf-nzdsk\" (UID: \"09a04267-a914-4c55-add8-735a053038d3\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.329675 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72bds\" (UniqueName: \"kubernetes.io/projected/ac18771f-5f45-40d8-b275-38e2e1c48ba6-kube-api-access-72bds\") pod \"barbican-operator-controller-manager-677bd678f7-q52nj\" (UID: \"ac18771f-5f45-40d8-b275-38e2e1c48ba6\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.344764 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346063 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-6h6dl\" (UniqueName: \"kubernetes.io/projected/ad93994a-26d2-4353-80be-456c1311020e-kube-api-access-6h6dl\") pod \"keystone-operator-controller-manager-684f77d66d-v5kxw\" (UID: \"ad93994a-26d2-4353-80be-456c1311020e\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346117 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fpv7\" (UniqueName: \"kubernetes.io/projected/c994e4ad-140c-4655-ad69-e4013406d12e-kube-api-access-6fpv7\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346217 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mzzc\" (UniqueName: \"kubernetes.io/projected/bf027c79-6bdb-4cfb-8c31-d785b80e2231-kube-api-access-5mzzc\") pod \"heat-operator-controller-manager-77b6666d85-mbxnn\" (UID: \"bf027c79-6bdb-4cfb-8c31-d785b80e2231\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346240 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8llj\" (UniqueName: \"kubernetes.io/projected/884f7bcb-08ef-49f3-912b-ca921e342615-kube-api-access-l8llj\") pod \"ironic-operator-controller-manager-6bbb499bbc-txdwh\" (UID: \"884f7bcb-08ef-49f3-912b-ca921e342615\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346267 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qls6d\" (UniqueName: 
\"kubernetes.io/projected/8a4cb5eb-0894-440e-8cfd-448651696a6f-kube-api-access-qls6d\") pod \"horizon-operator-controller-manager-6d9d6b584d-fvwbt\" (UID: \"8a4cb5eb-0894-440e-8cfd-448651696a6f\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346292 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346327 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/63474f68-d09d-4822-b650-96a37aead592-kube-api-access-5zmxb\") pod \"glance-operator-controller-manager-5964f64c48-w9dbp\" (UID: \"63474f68-d09d-4822-b650-96a37aead592\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.348797 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.354155 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kxnph" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.358795 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.362718 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.367423 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mzfp6" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.372949 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.387425 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.394622 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.395729 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.406974 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-m49wc" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.419181 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.433860 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.436400 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.436665 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453212 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62r6d\" (UniqueName: \"kubernetes.io/projected/8b177c77-d85f-4374-b6db-a700719c1282-kube-api-access-62r6d\") pod \"manila-operator-controller-manager-68f45f9d9f-dgrlb\" (UID: \"8b177c77-d85f-4374-b6db-a700719c1282\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453367 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h6dl\" (UniqueName: \"kubernetes.io/projected/ad93994a-26d2-4353-80be-456c1311020e-kube-api-access-6h6dl\") pod \"keystone-operator-controller-manager-684f77d66d-v5kxw\" (UID: \"ad93994a-26d2-4353-80be-456c1311020e\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453456 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fpv7\" (UniqueName: \"kubernetes.io/projected/c994e4ad-140c-4655-ad69-e4013406d12e-kube-api-access-6fpv7\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453574 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s6xm\" (UniqueName: \"kubernetes.io/projected/91415f40-08a2-451b-abe8-38c7b447e66f-kube-api-access-2s6xm\") pod \"neutron-operator-controller-manager-776c5696bf-snvh5\" (UID: \"91415f40-08a2-451b-abe8-38c7b447e66f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453669 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mzzc\" (UniqueName: \"kubernetes.io/projected/bf027c79-6bdb-4cfb-8c31-d785b80e2231-kube-api-access-5mzzc\") pod \"heat-operator-controller-manager-77b6666d85-mbxnn\" (UID: \"bf027c79-6bdb-4cfb-8c31-d785b80e2231\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453737 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8llj\" (UniqueName: \"kubernetes.io/projected/884f7bcb-08ef-49f3-912b-ca921e342615-kube-api-access-l8llj\") pod \"ironic-operator-controller-manager-6bbb499bbc-txdwh\" (UID: \"884f7bcb-08ef-49f3-912b-ca921e342615\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453837 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qls6d\" (UniqueName: \"kubernetes.io/projected/8a4cb5eb-0894-440e-8cfd-448651696a6f-kube-api-access-qls6d\") pod \"horizon-operator-controller-manager-6d9d6b584d-fvwbt\" (UID: \"8a4cb5eb-0894-440e-8cfd-448651696a6f\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453913 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9twc\" (UniqueName: \"kubernetes.io/projected/ec624ec4-966f-410c-95c7-73be0f9cad27-kube-api-access-m9twc\") pod \"mariadb-operator-controller-manager-658d4cdd5-kz9sv\" (UID: \"ec624ec4-966f-410c-95c7-73be0f9cad27\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453983 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.454060 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/63474f68-d09d-4822-b650-96a37aead592-kube-api-access-5zmxb\") pod \"glance-operator-controller-manager-5964f64c48-w9dbp\" (UID: \"63474f68-d09d-4822-b650-96a37aead592\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp"
Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.455061 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.455194 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:08.955177298 +0000 UTC m=+995.210075187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.458616 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.459570 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.462807 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-2qxwv"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.474542 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.475508 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.477178 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/63474f68-d09d-4822-b650-96a37aead592-kube-api-access-5zmxb\") pod \"glance-operator-controller-manager-5964f64c48-w9dbp\" (UID: \"63474f68-d09d-4822-b650-96a37aead592\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.478641 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bs9kv"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.482558 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.497468 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.497952 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h6dl\" (UniqueName: \"kubernetes.io/projected/ad93994a-26d2-4353-80be-456c1311020e-kube-api-access-6h6dl\") pod \"keystone-operator-controller-manager-684f77d66d-v5kxw\" (UID: \"ad93994a-26d2-4353-80be-456c1311020e\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.498535 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.499305 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8llj\" (UniqueName: \"kubernetes.io/projected/884f7bcb-08ef-49f3-912b-ca921e342615-kube-api-access-l8llj\") pod \"ironic-operator-controller-manager-6bbb499bbc-txdwh\" (UID: \"884f7bcb-08ef-49f3-912b-ca921e342615\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.504054 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-f4pm9"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.504552 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.510125 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fpv7\" (UniqueName: \"kubernetes.io/projected/c994e4ad-140c-4655-ad69-e4013406d12e-kube-api-access-6fpv7\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.514518 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.518781 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.519188 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mzzc\" (UniqueName: \"kubernetes.io/projected/bf027c79-6bdb-4cfb-8c31-d785b80e2231-kube-api-access-5mzzc\") pod \"heat-operator-controller-manager-77b6666d85-mbxnn\" (UID: \"bf027c79-6bdb-4cfb-8c31-d785b80e2231\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.519804 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.522088 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qls6d\" (UniqueName: \"kubernetes.io/projected/8a4cb5eb-0894-440e-8cfd-448651696a6f-kube-api-access-qls6d\") pod \"horizon-operator-controller-manager-6d9d6b584d-fvwbt\" (UID: \"8a4cb5eb-0894-440e-8cfd-448651696a6f\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.528702 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.529463 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.537556 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.538429 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.540563 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cggl5"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.540641 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.540882 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nh9bc"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.542190 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.545784 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.551356 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8flq2"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.551555 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.557974 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9twc\" (UniqueName: \"kubernetes.io/projected/ec624ec4-966f-410c-95c7-73be0f9cad27-kube-api-access-m9twc\") pod \"mariadb-operator-controller-manager-658d4cdd5-kz9sv\" (UID: \"ec624ec4-966f-410c-95c7-73be0f9cad27\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558026 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8h6h\" (UniqueName: \"kubernetes.io/projected/760c8dff-c64a-492b-a778-45ef16d197bd-kube-api-access-g8h6h\") pod \"nova-operator-controller-manager-569cc54c5-rpwdx\" (UID: \"760c8dff-c64a-492b-a778-45ef16d197bd\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558098 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28zzb\" (UniqueName: \"kubernetes.io/projected/c13f33e2-dd6a-4ca0-91e7-5489c753e273-kube-api-access-28zzb\") pod \"ovn-operator-controller-manager-bbc5b68f9-qnwgj\" (UID: \"c13f33e2-dd6a-4ca0-91e7-5489c753e273\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558140 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62r6d\" (UniqueName: \"kubernetes.io/projected/8b177c77-d85f-4374-b6db-a700719c1282-kube-api-access-62r6d\") pod \"manila-operator-controller-manager-68f45f9d9f-dgrlb\" (UID: \"8b177c77-d85f-4374-b6db-a700719c1282\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558163 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr7k9\" (UniqueName: \"kubernetes.io/projected/d0e08342-2d1b-42d9-921e-1d948f701a58-kube-api-access-fr7k9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-49gjk\" (UID: \"d0e08342-2d1b-42d9-921e-1d948f701a58\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558197 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558288 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnlh\" (UniqueName: \"kubernetes.io/projected/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-kube-api-access-2dnlh\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558381 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztg6p\" (UniqueName: \"kubernetes.io/projected/1b429bd6-00de-4cc2-8a18-9f58897b6834-kube-api-access-ztg6p\") pod \"swift-operator-controller-manager-677c674df7-m6wph\" (UID: \"1b429bd6-00de-4cc2-8a18-9f58897b6834\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558411 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s6xm\" (UniqueName: \"kubernetes.io/projected/91415f40-08a2-451b-abe8-38c7b447e66f-kube-api-access-2s6xm\") pod \"neutron-operator-controller-manager-776c5696bf-snvh5\" (UID: \"91415f40-08a2-451b-abe8-38c7b447e66f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.559331 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bw2b\" (UniqueName: \"kubernetes.io/projected/04b3aecb-7cfd-4042-b003-4bc8c339aff8-kube-api-access-9bw2b\") pod \"placement-operator-controller-manager-574d45c66c-pppd9\" (UID: \"04b3aecb-7cfd-4042-b003-4bc8c339aff8\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.573559 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.576056 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.578093 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jhct9"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.583376 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.585678 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.596159 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s6xm\" (UniqueName: \"kubernetes.io/projected/91415f40-08a2-451b-abe8-38c7b447e66f-kube-api-access-2s6xm\") pod \"neutron-operator-controller-manager-776c5696bf-snvh5\" (UID: \"91415f40-08a2-451b-abe8-38c7b447e66f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.600121 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9twc\" (UniqueName: \"kubernetes.io/projected/ec624ec4-966f-410c-95c7-73be0f9cad27-kube-api-access-m9twc\") pod \"mariadb-operator-controller-manager-658d4cdd5-kz9sv\" (UID: \"ec624ec4-966f-410c-95c7-73be0f9cad27\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.600321 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.607930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62r6d\" (UniqueName: \"kubernetes.io/projected/8b177c77-d85f-4374-b6db-a700719c1282-kube-api-access-62r6d\") pod \"manila-operator-controller-manager-68f45f9d9f-dgrlb\" (UID: \"8b177c77-d85f-4374-b6db-a700719c1282\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.625863 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.626794 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.631604 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nvmll"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.643580 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.661821 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.661854 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dnlh\" (UniqueName: \"kubernetes.io/projected/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-kube-api-access-2dnlh\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.661883 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwgl\" (UniqueName: \"kubernetes.io/projected/d3d3c04d-7e05-4df2-85c6-394d0bde1a69-kube-api-access-7vwgl\") pod \"test-operator-controller-manager-5c5cb9c4d7-8mpp4\" (UID: \"d3d3c04d-7e05-4df2-85c6-394d0bde1a69\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.661945 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztg6p\" (UniqueName: \"kubernetes.io/projected/1b429bd6-00de-4cc2-8a18-9f58897b6834-kube-api-access-ztg6p\") pod \"swift-operator-controller-manager-677c674df7-m6wph\" (UID: \"1b429bd6-00de-4cc2-8a18-9f58897b6834\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662013 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bw2b\" (UniqueName: \"kubernetes.io/projected/04b3aecb-7cfd-4042-b003-4bc8c339aff8-kube-api-access-9bw2b\") pod \"placement-operator-controller-manager-574d45c66c-pppd9\" (UID: \"04b3aecb-7cfd-4042-b003-4bc8c339aff8\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662031 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nmg\" (UniqueName: \"kubernetes.io/projected/3f4c2998-b51a-4620-b674-60bb0817eb7d-kube-api-access-88nmg\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-mkjnt\" (UID: \"3f4c2998-b51a-4620-b674-60bb0817eb7d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662064 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8h6h\" (UniqueName: \"kubernetes.io/projected/760c8dff-c64a-492b-a778-45ef16d197bd-kube-api-access-g8h6h\") pod \"nova-operator-controller-manager-569cc54c5-rpwdx\" (UID: \"760c8dff-c64a-492b-a778-45ef16d197bd\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662208 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28zzb\" (UniqueName: \"kubernetes.io/projected/c13f33e2-dd6a-4ca0-91e7-5489c753e273-kube-api-access-28zzb\") pod \"ovn-operator-controller-manager-bbc5b68f9-qnwgj\" (UID: \"c13f33e2-dd6a-4ca0-91e7-5489c753e273\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662273 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr7k9\" (UniqueName: \"kubernetes.io/projected/d0e08342-2d1b-42d9-921e-1d948f701a58-kube-api-access-fr7k9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-49gjk\" (UID: \"d0e08342-2d1b-42d9-921e-1d948f701a58\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"
Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.662222 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.662435 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:09.162416994 +0000 UTC m=+995.417314883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.681174 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dnlh\" (UniqueName: \"kubernetes.io/projected/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-kube-api-access-2dnlh\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.684710 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.684709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bw2b\" (UniqueName: \"kubernetes.io/projected/04b3aecb-7cfd-4042-b003-4bc8c339aff8-kube-api-access-9bw2b\") pod \"placement-operator-controller-manager-574d45c66c-pppd9\" (UID: \"04b3aecb-7cfd-4042-b003-4bc8c339aff8\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.685260 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.727661 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztg6p\" (UniqueName: \"kubernetes.io/projected/1b429bd6-00de-4cc2-8a18-9f58897b6834-kube-api-access-ztg6p\") pod \"swift-operator-controller-manager-677c674df7-m6wph\" (UID: \"1b429bd6-00de-4cc2-8a18-9f58897b6834\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.730300 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr7k9\" (UniqueName: \"kubernetes.io/projected/d0e08342-2d1b-42d9-921e-1d948f701a58-kube-api-access-fr7k9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-49gjk\" (UID: \"d0e08342-2d1b-42d9-921e-1d948f701a58\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.731298 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28zzb\" (UniqueName: \"kubernetes.io/projected/c13f33e2-dd6a-4ca0-91e7-5489c753e273-kube-api-access-28zzb\") pod \"ovn-operator-controller-manager-bbc5b68f9-qnwgj\" (UID: \"c13f33e2-dd6a-4ca0-91e7-5489c753e273\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.733384 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.736145 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8h6h\" (UniqueName: \"kubernetes.io/projected/760c8dff-c64a-492b-a778-45ef16d197bd-kube-api-access-g8h6h\") pod \"nova-operator-controller-manager-569cc54c5-rpwdx\" (UID: \"760c8dff-c64a-492b-a778-45ef16d197bd\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.766185 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.780266 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwgl\" (UniqueName: \"kubernetes.io/projected/d3d3c04d-7e05-4df2-85c6-394d0bde1a69-kube-api-access-7vwgl\") pod \"test-operator-controller-manager-5c5cb9c4d7-8mpp4\" (UID: \"d3d3c04d-7e05-4df2-85c6-394d0bde1a69\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.780529 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nmg\" (UniqueName: \"kubernetes.io/projected/3f4c2998-b51a-4620-b674-60bb0817eb7d-kube-api-access-88nmg\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-mkjnt\" (UID: \"3f4c2998-b51a-4620-b674-60bb0817eb7d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.786333 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.799833 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nmg\" (UniqueName: \"kubernetes.io/projected/3f4c2998-b51a-4620-b674-60bb0817eb7d-kube-api-access-88nmg\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-mkjnt\" (UID: \"3f4c2998-b51a-4620-b674-60bb0817eb7d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.801617 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.809709 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.813016 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.814857 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.816120 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.818918 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-j28tm"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.824275 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.833654 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwgl\" (UniqueName: \"kubernetes.io/projected/d3d3c04d-7e05-4df2-85c6-394d0bde1a69-kube-api-access-7vwgl\") pod \"test-operator-controller-manager-5c5cb9c4d7-8mpp4\" (UID: \"d3d3c04d-7e05-4df2-85c6-394d0bde1a69\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.841669 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.863644 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.864891 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.871315 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rpzgd"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.871535 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.871658 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.875580 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.882727 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qcgt\" (UniqueName: \"kubernetes.io/projected/a7216675-a296-4faa-9dd5-d857b15ffa3c-kube-api-access-9qcgt\") pod \"watcher-operator-controller-manager-6dd88c6f67-rkjsw\" (UID: \"a7216675-a296-4faa-9dd5-d857b15ffa3c\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.891319 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.899078 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.915989 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.918377 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.919455 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.922530 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mzvkm"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.934233 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn"]
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.941293 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989121 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzr7f\" (UniqueName: \"kubernetes.io/projected/969b2d39-fb99-42df-8e6e-3ded5cd292c8-kube-api-access-hzr7f\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989174 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll9mf\" (UniqueName: \"kubernetes.io/projected/475c1190-6d94-431a-943d-4e749ea87d6b-kube-api-access-ll9mf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjjsn\" (UID: \"475c1190-6d94-431a-943d-4e749ea87d6b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989213 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989240 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989335 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d"
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989383 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qcgt\" (UniqueName: \"kubernetes.io/projected/a7216675-a296-4faa-9dd5-d857b15ffa3c-kube-api-access-9qcgt\") pod \"watcher-operator-controller-manager-6dd88c6f67-rkjsw\" (UID: \"a7216675-a296-4faa-9dd5-d857b15ffa3c\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"
Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.989575 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.989635 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:09.989617487 +0000 UTC m=+996.244515376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found
Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.017845 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qcgt\" (UniqueName: \"kubernetes.io/projected/a7216675-a296-4faa-9dd5-d857b15ffa3c-kube-api-access-9qcgt\") pod \"watcher-operator-controller-manager-6dd88c6f67-rkjsw\" (UID: \"a7216675-a296-4faa-9dd5-d857b15ffa3c\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"
Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.090576 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzr7f\" (UniqueName: \"kubernetes.io/projected/969b2d39-fb99-42df-8e6e-3ded5cd292c8-kube-api-access-hzr7f\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"
Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.090623 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll9mf\" (UniqueName: \"kubernetes.io/projected/475c1190-6d94-431a-943d-4e749ea87d6b-kube-api-access-ll9mf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjjsn\" (UID: \"475c1190-6d94-431a-943d-4e749ea87d6b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn"
Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.090649 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: 
\"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.090666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.090833 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.090877 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:09.590864376 +0000 UTC m=+995.845762265 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.091463 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.091550 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:09.591535281 +0000 UTC m=+995.846433170 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.109279 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzr7f\" (UniqueName: \"kubernetes.io/projected/969b2d39-fb99-42df-8e6e-3ded5cd292c8-kube-api-access-hzr7f\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.110492 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll9mf\" (UniqueName: \"kubernetes.io/projected/475c1190-6d94-431a-943d-4e749ea87d6b-kube-api-access-ll9mf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjjsn\" (UID: \"475c1190-6d94-431a-943d-4e749ea87d6b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.191168 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.191748 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.191973 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.192025 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:10.192010125 +0000 UTC m=+996.446908014 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.239255 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.413225 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.419323 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.422711 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.439705 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad93994a_26d2_4353_80be_456c1311020e.slice/crio-5b38c901f6b7f89ef7b83e3b72b180ef7efe84c1ca1ac084fa760daeb3bbbb25 WatchSource:0}: Error finding container 5b38c901f6b7f89ef7b83e3b72b180ef7efe84c1ca1ac084fa760daeb3bbbb25: Status 404 returned error can't find the container with id 5b38c901f6b7f89ef7b83e3b72b180ef7efe84c1ca1ac084fa760daeb3bbbb25 Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.500616 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.504968 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.573608 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.578779 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.583496 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.585902 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04b3aecb_7cfd_4042_b003_4bc8c339aff8.slice/crio-60bbb8bffb79515b3a41c2c1f2b07bef370cae406c41ccd07dc12cc8fb433b6c WatchSource:0}: Error finding container 60bbb8bffb79515b3a41c2c1f2b07bef370cae406c41ccd07dc12cc8fb433b6c: Status 404 returned error can't find the container with id 60bbb8bffb79515b3a41c2c1f2b07bef370cae406c41ccd07dc12cc8fb433b6c Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.586296 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a4cb5eb_0894_440e_8cfd_448651696a6f.slice/crio-e9edadaaaaecb94eeaf9e713b4d29c8e53243c97e501e6db6e0f86f8d185c531 WatchSource:0}: Error finding container e9edadaaaaecb94eeaf9e713b4d29c8e53243c97e501e6db6e0f86f8d185c531: Status 404 returned error can't find the container with id e9edadaaaaecb94eeaf9e713b4d29c8e53243c97e501e6db6e0f86f8d185c531 Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.595955 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.598980 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " 
pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.599023 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.599191 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.599245 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:10.599229869 +0000 UTC m=+996.854127759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.599279 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.599384 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:10.599359093 +0000 UTC m=+996.854256982 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.600013 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf027c79_6bdb_4cfb_8c31_d785b80e2231.slice/crio-44cd699c14bc3891cde617e4b29e7c7a5c4ee18b7a9fe3dcaff0fb824962684a WatchSource:0}: Error finding container 44cd699c14bc3891cde617e4b29e7c7a5c4ee18b7a9fe3dcaff0fb824962684a: Status 404 returned error can't find the container with id 44cd699c14bc3891cde617e4b29e7c7a5c4ee18b7a9fe3dcaff0fb824962684a Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.605554 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"] Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.611397 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s6xm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-snvh5_openstack-operators(91415f40-08a2-451b-abe8-38c7b447e66f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.612885 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" podUID="91415f40-08a2-451b-abe8-38c7b447e66f" Mar 10 09:20:09 crc 
kubenswrapper[4883]: I0310 09:20:09.614724 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.618632 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.620780 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b177c77_d85f_4374_b6db_a700719c1282.slice/crio-70a38461ea45c63c7783503e6dd9bde5a86b428afbbe53a9331303eb17d10568 WatchSource:0}: Error finding container 70a38461ea45c63c7783503e6dd9bde5a86b428afbbe53a9331303eb17d10568: Status 404 returned error can't find the container with id 70a38461ea45c63c7783503e6dd9bde5a86b428afbbe53a9331303eb17d10568 Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.622100 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.622843 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec624ec4_966f_410c_95c7_73be0f9cad27.slice/crio-8ed5f266616163b2dab0d917ea7973f91352898c5fc26400c3adc4f5c0fc8803 WatchSource:0}: Error finding container 8ed5f266616163b2dab0d917ea7973f91352898c5fc26400c3adc4f5c0fc8803: Status 404 returned error can't find the container with id 8ed5f266616163b2dab0d917ea7973f91352898c5fc26400c3adc4f5c0fc8803 Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.623169 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-62r6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-68f45f9d9f-dgrlb_openstack-operators(8b177c77-d85f-4374-b6db-a700719c1282): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.624859 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" podUID="8b177c77-d85f-4374-b6db-a700719c1282" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.627588 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m9twc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-658d4cdd5-kz9sv_openstack-operators(ec624ec4-966f-410c-95c7-73be0f9cad27): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.628811 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" podUID="ec624ec4-966f-410c-95c7-73be0f9cad27" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.769091 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" event={"ID":"09a04267-a914-4c55-add8-735a053038d3","Type":"ContainerStarted","Data":"57b7d167675581f7d77dbe6719ce7571cfacfbf3a9df39d8dcb201f7b39c4efd"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.773620 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" 
event={"ID":"884f7bcb-08ef-49f3-912b-ca921e342615","Type":"ContainerStarted","Data":"47b69c6469e9c0fda49dc764279cc1c2fadd463dd8a89ae2d0549de32d4aaede"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.775073 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" event={"ID":"bf027c79-6bdb-4cfb-8c31-d785b80e2231","Type":"ContainerStarted","Data":"44cd699c14bc3891cde617e4b29e7c7a5c4ee18b7a9fe3dcaff0fb824962684a"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.775116 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.776145 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" event={"ID":"9a394c48-31ca-4e99-b210-45ae6f67faaa","Type":"ContainerStarted","Data":"168479b3a9b0d2df2917b7917bf2f844296689cced63fd65042a333c2530e9f0"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.776917 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" event={"ID":"63474f68-d09d-4822-b650-96a37aead592","Type":"ContainerStarted","Data":"83d1f4358fd5f31070cdf493abed356016a218487b8aac79e0f7df81791ff9fd"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.779591 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.779993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" event={"ID":"8a4cb5eb-0894-440e-8cfd-448651696a6f","Type":"ContainerStarted","Data":"e9edadaaaaecb94eeaf9e713b4d29c8e53243c97e501e6db6e0f86f8d185c531"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 
09:20:09.783336 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"] Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.789462 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-88nmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-mkjnt_openstack-operators(3f4c2998-b51a-4620-b674-60bb0817eb7d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.789470 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fr7k9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-49gjk_openstack-operators(d0e08342-2d1b-42d9-921e-1d948f701a58): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.790252 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" event={"ID":"04b3aecb-7cfd-4042-b003-4bc8c339aff8","Type":"ContainerStarted","Data":"60bbb8bffb79515b3a41c2c1f2b07bef370cae406c41ccd07dc12cc8fb433b6c"} Mar 10 
09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.790793 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" podUID="d0e08342-2d1b-42d9-921e-1d948f701a58" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.790839 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" podUID="3f4c2998-b51a-4620-b674-60bb0817eb7d" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.792998 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.795927 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13f33e2_dd6a_4ca0_91e7_5489c753e273.slice/crio-95d2af89ebbe9034f960439b27501b09e2bac4eb8f1ce5a69820634941438262 WatchSource:0}: Error finding container 95d2af89ebbe9034f960439b27501b09e2bac4eb8f1ce5a69820634941438262: Status 404 returned error can't find the container with id 95d2af89ebbe9034f960439b27501b09e2bac4eb8f1ce5a69820634941438262 Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.796143 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" event={"ID":"91415f40-08a2-451b-abe8-38c7b447e66f","Type":"ContainerStarted","Data":"7a894b2192ecbc4bf04b66f086babcfd753f613e3042b73a33afc3dff20e3446"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.797348 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" 
event={"ID":"ec624ec4-966f-410c-95c7-73be0f9cad27","Type":"ContainerStarted","Data":"8ed5f266616163b2dab0d917ea7973f91352898c5fc26400c3adc4f5c0fc8803"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.799089 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" event={"ID":"760c8dff-c64a-492b-a778-45ef16d197bd","Type":"ContainerStarted","Data":"54c69f7c05dc1ca1473350fb37daec194b8955b9bca06697380d6641a56bf5ba"} Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.800456 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" podUID="91415f40-08a2-451b-abe8-38c7b447e66f" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.800979 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" podUID="ec624ec4-966f-410c-95c7-73be0f9cad27" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.801405 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" event={"ID":"8b177c77-d85f-4374-b6db-a700719c1282","Type":"ContainerStarted","Data":"70a38461ea45c63c7783503e6dd9bde5a86b428afbbe53a9331303eb17d10568"} Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.802664 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b429bd6_00de_4cc2_8a18_9f58897b6834.slice/crio-4eb7fb72943ec2e70cd70ab6beea51a9c9dc109f24845cd41cef0da72a4ae727 WatchSource:0}: Error finding container 4eb7fb72943ec2e70cd70ab6beea51a9c9dc109f24845cd41cef0da72a4ae727: Status 404 returned error can't find the container with id 4eb7fb72943ec2e70cd70ab6beea51a9c9dc109f24845cd41cef0da72a4ae727 Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.802786 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\"\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" podUID="8b177c77-d85f-4374-b6db-a700719c1282" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.803229 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" event={"ID":"ac18771f-5f45-40d8-b275-38e2e1c48ba6","Type":"ContainerStarted","Data":"0d3e72f814efc2be6fc92c843167a3a5ca521b1a88c10692ca238e1474290b62"} Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.803315 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28zzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-qnwgj_openstack-operators(c13f33e2-dd6a-4ca0-91e7-5489c753e273): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.803836 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7216675_a296_4faa_9dd5_d857b15ffa3c.slice/crio-e562d4a75b9b0f77867ac5d95482e2f08321e1371a7015804516057212f015a4 WatchSource:0}: Error finding container e562d4a75b9b0f77867ac5d95482e2f08321e1371a7015804516057212f015a4: Status 404 returned error can't find the container with id e562d4a75b9b0f77867ac5d95482e2f08321e1371a7015804516057212f015a4 Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.804415 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" podUID="c13f33e2-dd6a-4ca0-91e7-5489c753e273" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.805383 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" event={"ID":"ad93994a-26d2-4353-80be-456c1311020e","Type":"ContainerStarted","Data":"5b38c901f6b7f89ef7b83e3b72b180ef7efe84c1ca1ac084fa760daeb3bbbb25"} Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.806309 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} 
{} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9qcgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-rkjsw_openstack-operators(a7216675-a296-4faa-9dd5-d857b15ffa3c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.807142 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ztg6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-m6wph_openstack-operators(1b429bd6-00de-4cc2-8a18-9f58897b6834): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.808151 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" podUID="a7216675-a296-4faa-9dd5-d857b15ffa3c" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.808326 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" podUID="1b429bd6-00de-4cc2-8a18-9f58897b6834" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.815567 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"] Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.818599 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ll9mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pjjsn_openstack-operators(475c1190-6d94-431a-943d-4e749ea87d6b): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.819864 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" podUID="475c1190-6d94-431a-943d-4e749ea87d6b" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.821940 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.829115 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn"] Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.004791 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.005017 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.005126 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:12.005103025 +0000 UTC m=+998.260000914 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.208371 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.208684 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.208822 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:12.208791216 +0000 UTC m=+998.463689104 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.614270 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.614634 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.614510 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.614776 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.614779 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:12.614757447 +0000 UTC m=+998.869655337 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.614851 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:12.614836346 +0000 UTC m=+998.869734235 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.816301 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" event={"ID":"1b429bd6-00de-4cc2-8a18-9f58897b6834","Type":"ContainerStarted","Data":"4eb7fb72943ec2e70cd70ab6beea51a9c9dc109f24845cd41cef0da72a4ae727"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.818129 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" podUID="1b429bd6-00de-4cc2-8a18-9f58897b6834" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.818510 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" 
event={"ID":"3f4c2998-b51a-4620-b674-60bb0817eb7d","Type":"ContainerStarted","Data":"9e6c01249661be687c2ade9349da3c4b471b06c572b3480c83adf1edb0dbdb75"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.819765 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" podUID="3f4c2998-b51a-4620-b674-60bb0817eb7d" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.821545 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" event={"ID":"d0e08342-2d1b-42d9-921e-1d948f701a58","Type":"ContainerStarted","Data":"83be3ce4f1134f8106165b66a365c7f6385a705befea29b15acd0ad9c321bea9"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.822729 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" podUID="d0e08342-2d1b-42d9-921e-1d948f701a58" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.823095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" event={"ID":"a7216675-a296-4faa-9dd5-d857b15ffa3c","Type":"ContainerStarted","Data":"e562d4a75b9b0f77867ac5d95482e2f08321e1371a7015804516057212f015a4"} Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.828189 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" 
event={"ID":"c13f33e2-dd6a-4ca0-91e7-5489c753e273","Type":"ContainerStarted","Data":"95d2af89ebbe9034f960439b27501b09e2bac4eb8f1ce5a69820634941438262"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.828769 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" podUID="a7216675-a296-4faa-9dd5-d857b15ffa3c" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.829835 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" podUID="c13f33e2-dd6a-4ca0-91e7-5489c753e273" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.831942 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" event={"ID":"d3d3c04d-7e05-4df2-85c6-394d0bde1a69","Type":"ContainerStarted","Data":"20419ef0a719d8f320d143264e1238ecc015d808842afc2b967ed0ef75b655ec"} Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.835210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" event={"ID":"475c1190-6d94-431a-943d-4e749ea87d6b","Type":"ContainerStarted","Data":"ae1c00df4359638bd98f4acdf16b26bc4b854e6f9f48cdab9c86749e576e2478"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.838879 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" podUID="475c1190-6d94-431a-943d-4e749ea87d6b" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.838944 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" podUID="91415f40-08a2-451b-abe8-38c7b447e66f" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.838987 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" podUID="ec624ec4-966f-410c-95c7-73be0f9cad27" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.842107 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\"\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" podUID="8b177c77-d85f-4374-b6db-a700719c1282" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.847737 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" podUID="475c1190-6d94-431a-943d-4e749ea87d6b" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848265 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" podUID="a7216675-a296-4faa-9dd5-d857b15ffa3c" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848313 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" podUID="1b429bd6-00de-4cc2-8a18-9f58897b6834" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848385 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" podUID="3f4c2998-b51a-4620-b674-60bb0817eb7d" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848431 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" podUID="d0e08342-2d1b-42d9-921e-1d948f701a58" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848559 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" podUID="c13f33e2-dd6a-4ca0-91e7-5489c753e273" Mar 10 09:20:12 crc kubenswrapper[4883]: I0310 09:20:12.040926 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.041101 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.041192 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:16.041170432 +0000 UTC m=+1002.296068320 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: I0310 09:20:12.248610 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.248796 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.249098 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:16.249075193 +0000 UTC m=+1002.503973082 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: I0310 09:20:12.653912 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:12 crc kubenswrapper[4883]: I0310 09:20:12.653975 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.654268 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.654315 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.654367 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:16.654347627 +0000 UTC m=+1002.909245515 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.654412 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:16.65439203 +0000 UTC m=+1002.909289918 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: I0310 09:20:16.112609 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.112933 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.113548 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:24.113523186 +0000 UTC m=+1010.368421064 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: I0310 09:20:16.317104 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.317351 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.317440 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:24.317409269 +0000 UTC m=+1010.572307159 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: I0310 09:20:16.723729 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:16 crc kubenswrapper[4883]: I0310 09:20:16.723807 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.723921 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.724002 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:24.723980813 +0000 UTC m=+1010.978878702 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.724086 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.724171 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:24.724135013 +0000 UTC m=+1010.979032892 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.448582 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.448815 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.448858 4883 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.449262 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.449301 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e" gracePeriod=600 Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.913771 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e" exitCode=0 Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.913814 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e"} Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.913845 4883 scope.go:117] "RemoveContainer" containerID="263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.922652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" 
event={"ID":"d3d3c04d-7e05-4df2-85c6-394d0bde1a69","Type":"ContainerStarted","Data":"7e23f10ff2ef9bcc37e8bd3393d601e3c367e0ac7cff2e0a201fea76c5c58a18"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.923362 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.924794 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" event={"ID":"8a4cb5eb-0894-440e-8cfd-448651696a6f","Type":"ContainerStarted","Data":"e7a91213ba34c8c1a78394a0fc3403d4f60c5f586f936b099529943421d6cad5"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.924890 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.926928 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" event={"ID":"884f7bcb-08ef-49f3-912b-ca921e342615","Type":"ContainerStarted","Data":"1cc064a3ba89d1674b9dc0011507e630993bd607b597a4606a644ee2dcbdecc6"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.927084 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.928738 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" event={"ID":"bf027c79-6bdb-4cfb-8c31-d785b80e2231","Type":"ContainerStarted","Data":"dc98f8b9ed49b5a1d62726c263856a3bade2c5ba6e98e23146118deed6031412"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.928794 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.930419 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" event={"ID":"09a04267-a914-4c55-add8-735a053038d3","Type":"ContainerStarted","Data":"111e96d855f304b51a0130b8400e37afdcb6ad41be6828cc22d88f5462faa519"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.930514 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.933441 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.934835 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" event={"ID":"760c8dff-c64a-492b-a778-45ef16d197bd","Type":"ContainerStarted","Data":"a9b77e4ec829f7f1da64203e5c1487f1dc19a319bd5c353083d660f457a53249"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.934952 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.936567 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" event={"ID":"9a394c48-31ca-4e99-b210-45ae6f67faaa","Type":"ContainerStarted","Data":"768f33138eb2722dccdf0f5396f1112ba1711d3e4a647933f731a1bba199bb8d"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.936661 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.938211 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" event={"ID":"63474f68-d09d-4822-b650-96a37aead592","Type":"ContainerStarted","Data":"a2b30f0071070c883067425c34be93fc6509f16a62dac19c638343e99715fa77"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.938363 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.942372 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" podStartSLOduration=2.207306025 podStartE2EDuration="10.942355855s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.78124318 +0000 UTC m=+996.036141069" lastFinishedPulling="2026-03-10 09:20:18.516293009 +0000 UTC m=+1004.771190899" observedRunningTime="2026-03-10 09:20:18.93806871 +0000 UTC m=+1005.192966599" watchObservedRunningTime="2026-03-10 09:20:18.942355855 +0000 UTC m=+1005.197253744" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.945541 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" event={"ID":"ac18771f-5f45-40d8-b275-38e2e1c48ba6","Type":"ContainerStarted","Data":"7dda4daa1fb1649747b31147546d2ad5ec9a2d80673842518fe392ff7c43f7d2"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.945954 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.947390 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" event={"ID":"04b3aecb-7cfd-4042-b003-4bc8c339aff8","Type":"ContainerStarted","Data":"343a950e4152184cdf00aae8d1d145bbc817dce45e0c5e74877e6bc1749e49b9"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.947517 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.948813 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" event={"ID":"ad93994a-26d2-4353-80be-456c1311020e","Type":"ContainerStarted","Data":"99ad760dad0f846d00a375655e727e54d33e58c5cadcefcbaa5ae6d3bfebbec0"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.948971 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.961139 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" podStartSLOduration=1.9989325409999998 podStartE2EDuration="10.961121709s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.503218843 +0000 UTC m=+995.758116732" lastFinishedPulling="2026-03-10 09:20:18.465408011 +0000 UTC m=+1004.720305900" observedRunningTime="2026-03-10 09:20:18.959661596 +0000 UTC m=+1005.214559484" watchObservedRunningTime="2026-03-10 09:20:18.961121709 +0000 UTC m=+1005.216019588" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.982319 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" podStartSLOduration=1.8990351159999999 podStartE2EDuration="10.982299752s" podCreationTimestamp="2026-03-10 
09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.436925111 +0000 UTC m=+995.691823000" lastFinishedPulling="2026-03-10 09:20:18.520189747 +0000 UTC m=+1004.775087636" observedRunningTime="2026-03-10 09:20:18.981351343 +0000 UTC m=+1005.236249233" watchObservedRunningTime="2026-03-10 09:20:18.982299752 +0000 UTC m=+1005.237197641" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.030515 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" podStartSLOduration=2.112578469 podStartE2EDuration="11.030500218s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.602456003 +0000 UTC m=+995.857353891" lastFinishedPulling="2026-03-10 09:20:18.520377751 +0000 UTC m=+1004.775275640" observedRunningTime="2026-03-10 09:20:19.02878308 +0000 UTC m=+1005.283680959" watchObservedRunningTime="2026-03-10 09:20:19.030500218 +0000 UTC m=+1005.285398107" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.084508 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" podStartSLOduration=2.08306965 podStartE2EDuration="11.084487285s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.514965883 +0000 UTC m=+995.769863773" lastFinishedPulling="2026-03-10 09:20:18.516383519 +0000 UTC m=+1004.771281408" observedRunningTime="2026-03-10 09:20:19.060952928 +0000 UTC m=+1005.315850817" watchObservedRunningTime="2026-03-10 09:20:19.084487285 +0000 UTC m=+1005.339385173" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.099631 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" podStartSLOduration=2.181918445 podStartE2EDuration="11.099595993s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" 
firstStartedPulling="2026-03-10 09:20:09.603187251 +0000 UTC m=+995.858085141" lastFinishedPulling="2026-03-10 09:20:18.52086481 +0000 UTC m=+1004.775762689" observedRunningTime="2026-03-10 09:20:19.099210826 +0000 UTC m=+1005.354108716" watchObservedRunningTime="2026-03-10 09:20:19.099595993 +0000 UTC m=+1005.354493882" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.144942 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" podStartSLOduration=2.220559876 podStartE2EDuration="11.14492291s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.591362946 +0000 UTC m=+995.846260835" lastFinishedPulling="2026-03-10 09:20:18.51572598 +0000 UTC m=+1004.770623869" observedRunningTime="2026-03-10 09:20:19.142320423 +0000 UTC m=+1005.397218313" watchObservedRunningTime="2026-03-10 09:20:19.14492291 +0000 UTC m=+1005.399820799" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.224839 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" podStartSLOduration=2.146921735 podStartE2EDuration="11.224810825s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.435531032 +0000 UTC m=+995.690428921" lastFinishedPulling="2026-03-10 09:20:18.513420122 +0000 UTC m=+1004.768318011" observedRunningTime="2026-03-10 09:20:19.22113794 +0000 UTC m=+1005.476035829" watchObservedRunningTime="2026-03-10 09:20:19.224810825 +0000 UTC m=+1005.479708714" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.295967 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" podStartSLOduration=2.363514839 podStartE2EDuration="11.295946377s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" 
firstStartedPulling="2026-03-10 09:20:09.588448641 +0000 UTC m=+995.843346530" lastFinishedPulling="2026-03-10 09:20:18.520880178 +0000 UTC m=+1004.775778068" observedRunningTime="2026-03-10 09:20:19.295280742 +0000 UTC m=+1005.550178631" watchObservedRunningTime="2026-03-10 09:20:19.295946377 +0000 UTC m=+1005.550844267" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.359332 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" podStartSLOduration=2.462614961 podStartE2EDuration="11.359315264s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.585963925 +0000 UTC m=+995.840861815" lastFinishedPulling="2026-03-10 09:20:18.482664239 +0000 UTC m=+1004.737562118" observedRunningTime="2026-03-10 09:20:19.355023683 +0000 UTC m=+1005.609921571" watchObservedRunningTime="2026-03-10 09:20:19.359315264 +0000 UTC m=+1005.614213154" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.391585 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" podStartSLOduration=2.289595797 podStartE2EDuration="11.391568339s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.441873251 +0000 UTC m=+995.696771139" lastFinishedPulling="2026-03-10 09:20:18.543845791 +0000 UTC m=+1004.798743681" observedRunningTime="2026-03-10 09:20:19.38758028 +0000 UTC m=+1005.642478169" watchObservedRunningTime="2026-03-10 09:20:19.391568339 +0000 UTC m=+1005.646466229" Mar 10 09:20:24 crc kubenswrapper[4883]: I0310 09:20:24.135629 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.135716 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.136341 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:40.136318563 +0000 UTC m=+1026.391216452 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: I0310 09:20:24.339576 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.339833 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.339947 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:40.339925881 +0000 UTC m=+1026.594823770 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: I0310 09:20:24.747011 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:24 crc kubenswrapper[4883]: I0310 09:20:24.747306 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.747218 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.747532 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:40.747517517 +0000 UTC m=+1027.002415406 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.747465 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.747714 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:40.747689862 +0000 UTC m=+1027.002587750 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.424972 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.437827 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.439786 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.591229 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.608521 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.771328 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.799822 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.813091 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.817669 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.879288 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.944433 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.029921 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" event={"ID":"8b177c77-d85f-4374-b6db-a700719c1282","Type":"ContainerStarted","Data":"2b22b77f345de2ec28206c76d238c648870af6a448d35083baa444304148a8de"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.030430 4883 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.033051 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" event={"ID":"475c1190-6d94-431a-943d-4e749ea87d6b","Type":"ContainerStarted","Data":"da557b399e4261a9114cb2eb0f95fabbaed65c94b06d9fced6cd9a82ebc3bf15"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.034919 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" event={"ID":"1b429bd6-00de-4cc2-8a18-9f58897b6834","Type":"ContainerStarted","Data":"0faabe56de5bb1f2b247e7d90820445910d2a35d205010b8244c27c233669ee3"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.035078 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.036259 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" event={"ID":"3f4c2998-b51a-4620-b674-60bb0817eb7d","Type":"ContainerStarted","Data":"39c59132c7a776ad64e1758acc76c10b7a6c76a512d094556ee28d5932c7ca7b"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.036468 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.037816 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" event={"ID":"91415f40-08a2-451b-abe8-38c7b447e66f","Type":"ContainerStarted","Data":"f401e6cdb7f542da8d4084cf2814b23a1ee7684a0ebd7b034112835f6dc2e47d"} Mar 10 09:20:30 crc kubenswrapper[4883]: 
I0310 09:20:30.037991 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.039443 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" event={"ID":"ec624ec4-966f-410c-95c7-73be0f9cad27","Type":"ContainerStarted","Data":"d3cd4289e8a33c51a2b626dff092b53667b2ae2fa77c8ee6e9a28239738665cf"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.039657 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.041193 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" event={"ID":"a7216675-a296-4faa-9dd5-d857b15ffa3c","Type":"ContainerStarted","Data":"582f2959b40ee58a20fdefb5b62254d9b58c00ad09eb1a6157268c8ab23b7988"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.041377 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.042749 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" podStartSLOduration=2.959986144 podStartE2EDuration="22.042730791s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.623028764 +0000 UTC m=+995.877926654" lastFinishedPulling="2026-03-10 09:20:28.705773412 +0000 UTC m=+1014.960671301" observedRunningTime="2026-03-10 09:20:30.040925968 +0000 UTC m=+1016.295823858" watchObservedRunningTime="2026-03-10 09:20:30.042730791 +0000 UTC m=+1016.297628680" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 
09:20:30.053768 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" podStartSLOduration=3.553471607 podStartE2EDuration="22.053752323s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.627308985 +0000 UTC m=+995.882206874" lastFinishedPulling="2026-03-10 09:20:28.127589701 +0000 UTC m=+1014.382487590" observedRunningTime="2026-03-10 09:20:30.051833064 +0000 UTC m=+1016.306730954" watchObservedRunningTime="2026-03-10 09:20:30.053752323 +0000 UTC m=+1016.308650212" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.064696 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" podStartSLOduration=2.590838335 podStartE2EDuration="22.064682402s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.789275826 +0000 UTC m=+996.044173715" lastFinishedPulling="2026-03-10 09:20:29.263119893 +0000 UTC m=+1015.518017782" observedRunningTime="2026-03-10 09:20:30.061115576 +0000 UTC m=+1016.316013466" watchObservedRunningTime="2026-03-10 09:20:30.064682402 +0000 UTC m=+1016.319580292" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.081235 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" podStartSLOduration=2.3919529219999998 podStartE2EDuration="22.081175021s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.611264071 +0000 UTC m=+995.866161960" lastFinishedPulling="2026-03-10 09:20:29.30048617 +0000 UTC m=+1015.555384059" observedRunningTime="2026-03-10 09:20:30.078051161 +0000 UTC m=+1016.332949049" watchObservedRunningTime="2026-03-10 09:20:30.081175021 +0000 UTC m=+1016.336072910" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 
09:20:30.097112 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" podStartSLOduration=2.614934079 podStartE2EDuration="22.09710094s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.818350128 +0000 UTC m=+996.073248017" lastFinishedPulling="2026-03-10 09:20:29.300516989 +0000 UTC m=+1015.555414878" observedRunningTime="2026-03-10 09:20:30.093509188 +0000 UTC m=+1016.348407076" watchObservedRunningTime="2026-03-10 09:20:30.09710094 +0000 UTC m=+1016.351998829" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.134621 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" podStartSLOduration=3.234709236 podStartE2EDuration="22.134601842s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.806020199 +0000 UTC m=+996.060918088" lastFinishedPulling="2026-03-10 09:20:28.705912805 +0000 UTC m=+1014.960810694" observedRunningTime="2026-03-10 09:20:30.133465539 +0000 UTC m=+1016.388363428" watchObservedRunningTime="2026-03-10 09:20:30.134601842 +0000 UTC m=+1016.389499731" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.135131 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" podStartSLOduration=2.6768084979999998 podStartE2EDuration="22.13512627s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.807041725 +0000 UTC m=+996.061939614" lastFinishedPulling="2026-03-10 09:20:29.265359496 +0000 UTC m=+1015.520257386" observedRunningTime="2026-03-10 09:20:30.117053072 +0000 UTC m=+1016.371950961" watchObservedRunningTime="2026-03-10 09:20:30.13512627 +0000 UTC m=+1016.390024160" Mar 10 09:20:34 crc kubenswrapper[4883]: I0310 09:20:34.589182 4883 
scope.go:117] "RemoveContainer" containerID="3e7da8f0c03e771b080917bc83392de1ddb5243f6ec147ddb91205eab0cfd88f" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.688215 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.690439 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.737907 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.902332 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.923073 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.123538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" event={"ID":"d0e08342-2d1b-42d9-921e-1d948f701a58","Type":"ContainerStarted","Data":"a026ed0e71110320753d42e030185f6f7a0fdc8887cc65f0a34dde9e777bf6de"} Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.123747 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.125691 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" 
event={"ID":"c13f33e2-dd6a-4ca0-91e7-5489c753e273","Type":"ContainerStarted","Data":"a113883fa844ed32d6c3d17f9729f9be9ecd130d7b96f7c819adafd130b87ad9"} Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.125886 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.205118 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.229150 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" podStartSLOduration=2.833061728 podStartE2EDuration="31.22913217s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.789338194 +0000 UTC m=+996.044236073" lastFinishedPulling="2026-03-10 09:20:38.185408626 +0000 UTC m=+1024.440306515" observedRunningTime="2026-03-10 09:20:39.138354661 +0000 UTC m=+1025.393252550" watchObservedRunningTime="2026-03-10 09:20:39.22913217 +0000 UTC m=+1025.484030058" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.230050 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" podStartSLOduration=2.85096265 podStartE2EDuration="31.230043437s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.803198368 +0000 UTC m=+996.058096257" lastFinishedPulling="2026-03-10 09:20:38.182279145 +0000 UTC m=+1024.437177044" observedRunningTime="2026-03-10 09:20:39.224844745 +0000 UTC m=+1025.479742635" watchObservedRunningTime="2026-03-10 09:20:39.230043437 +0000 UTC m=+1025.484941327" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.178755 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.185604 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.362053 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4ld57" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.370805 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.380731 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.385209 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.668319 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cggl5" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.676847 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.774451 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d"] Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.788635 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.788689 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.795520 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.795921 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " 
pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.873805 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"] Mar 10 09:20:40 crc kubenswrapper[4883]: W0310 09:20:40.878050 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a2580ec_7e99_4eb0_95e2_9e6ca33a6a5f.slice/crio-bb77f866bd704e415da5ad71f185c149c2da85a03c08e79218770975bdc6b666 WatchSource:0}: Error finding container bb77f866bd704e415da5ad71f185c149c2da85a03c08e79218770975bdc6b666: Status 404 returned error can't find the container with id bb77f866bd704e415da5ad71f185c149c2da85a03c08e79218770975bdc6b666 Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.010275 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rpzgd" Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.018615 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.145808 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" event={"ID":"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f","Type":"ContainerStarted","Data":"bb77f866bd704e415da5ad71f185c149c2da85a03c08e79218770975bdc6b666"} Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.147805 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" event={"ID":"c994e4ad-140c-4655-ad69-e4013406d12e","Type":"ContainerStarted","Data":"e6b6002ce0e0bc69c96acf006c9fd4d004a12051d206f6b2293c00069f834e1f"} Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.412098 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"] Mar 10 09:20:41 crc kubenswrapper[4883]: W0310 09:20:41.416064 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969b2d39_fb99_42df_8e6e_3ded5cd292c8.slice/crio-4ab786620574c94df8159751f4aebe381decf94fa2054c0bcc2e0632eeba008b WatchSource:0}: Error finding container 4ab786620574c94df8159751f4aebe381decf94fa2054c0bcc2e0632eeba008b: Status 404 returned error can't find the container with id 4ab786620574c94df8159751f4aebe381decf94fa2054c0bcc2e0632eeba008b Mar 10 09:20:42 crc kubenswrapper[4883]: I0310 09:20:42.162674 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" event={"ID":"969b2d39-fb99-42df-8e6e-3ded5cd292c8","Type":"ContainerStarted","Data":"159f95dc45d61cdf552b5808596cfacbb5b9148438d095c3d9091b58f2bea9b0"} Mar 10 09:20:42 crc kubenswrapper[4883]: I0310 09:20:42.162728 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" event={"ID":"969b2d39-fb99-42df-8e6e-3ded5cd292c8","Type":"ContainerStarted","Data":"4ab786620574c94df8159751f4aebe381decf94fa2054c0bcc2e0632eeba008b"} Mar 10 09:20:42 crc kubenswrapper[4883]: I0310 09:20:42.162752 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:42 crc kubenswrapper[4883]: I0310 09:20:42.195340 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" podStartSLOduration=34.195325652 podStartE2EDuration="34.195325652s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:20:42.184973522 +0000 UTC m=+1028.439871411" watchObservedRunningTime="2026-03-10 09:20:42.195325652 +0000 UTC m=+1028.450223542" Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.192532 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" event={"ID":"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f","Type":"ContainerStarted","Data":"aabf5d387c6030614d20fd997892a9b068b90ddfeb567493b08ff36ea990cd9d"} Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.193097 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.194793 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" 
event={"ID":"c994e4ad-140c-4655-ad69-e4013406d12e","Type":"ContainerStarted","Data":"08ed32918672484b4be49c63cf650cd0b50a264e2e1f43db593f8b55c6995c80"} Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.194875 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.214087 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" podStartSLOduration=34.025012163 podStartE2EDuration="37.214076747s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:40.880380629 +0000 UTC m=+1027.135278519" lastFinishedPulling="2026-03-10 09:20:44.069445223 +0000 UTC m=+1030.324343103" observedRunningTime="2026-03-10 09:20:45.213573508 +0000 UTC m=+1031.468471396" watchObservedRunningTime="2026-03-10 09:20:45.214076747 +0000 UTC m=+1031.468974636" Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.227491 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" podStartSLOduration=33.935478211 podStartE2EDuration="37.227448681s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:40.78197639 +0000 UTC m=+1027.036874280" lastFinishedPulling="2026-03-10 09:20:44.073946861 +0000 UTC m=+1030.328844750" observedRunningTime="2026-03-10 09:20:45.226527584 +0000 UTC m=+1031.481425473" watchObservedRunningTime="2026-03-10 09:20:45.227448681 +0000 UTC m=+1031.482346570" Mar 10 09:20:48 crc kubenswrapper[4883]: I0310 09:20:48.805036 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" Mar 10 09:20:48 crc kubenswrapper[4883]: I0310 09:20:48.844575 4883 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" Mar 10 09:20:50 crc kubenswrapper[4883]: I0310 09:20:50.377175 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:50 crc kubenswrapper[4883]: I0310 09:20:50.685350 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:51 crc kubenswrapper[4883]: I0310 09:20:51.026283 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.446869 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"] Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.448609 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.453987 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.454140 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.454233 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.454323 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zrbkq" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.459490 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"] Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.495359 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"] Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.496701 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.498063 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.510430 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"] Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.540725 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.540802 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.540884 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.541079 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " 
pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.541143 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.642562 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.642623 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.642706 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.642804 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 
09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.643892 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.643814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.643816 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.644728 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.661886 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.661945 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.766109 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.809088 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-454p7"
Mar 10 09:21:06 crc kubenswrapper[4883]: I0310 09:21:06.153615 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"]
Mar 10 09:21:06 crc kubenswrapper[4883]: I0310 09:21:06.195672 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"]
Mar 10 09:21:06 crc kubenswrapper[4883]: I0310 09:21:06.358334 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" event={"ID":"f3a815ae-f56c-4ad8-a4cd-b202012bf94a","Type":"ContainerStarted","Data":"83fd7b7c14ede20be946470ce0f3534de6b73ac0b442f2cc761c12340562dfa2"}
Mar 10 09:21:06 crc kubenswrapper[4883]: I0310 09:21:06.360141 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" event={"ID":"b1f1ef1a-4311-492d-b626-484f3b8ae836","Type":"ContainerStarted","Data":"b4c4edf66ac859048fbaea168e4fb72023b5242b17e67fe3301372d7bb2750e3"}
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.193205 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.217291 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.218428 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.235509 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.289388 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.289563 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.289707 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.391231 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.391315 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.391354 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.392184 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.392867 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.417267 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.449911 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.472656 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.491889 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.526487 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.553504 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.596069 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.596176 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.596219 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.698316 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.698455 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.698520 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.699831 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.700923 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.715342 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.823099 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.095863 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"]
Mar 10 09:21:09 crc kubenswrapper[4883]: W0310 09:21:09.107068 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8010fd0c_6a0f_4078_851d_aff31b9efa90.slice/crio-27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d WatchSource:0}: Error finding container 27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d: Status 404 returned error can't find the container with id 27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.232987 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"]
Mar 10 09:21:09 crc kubenswrapper[4883]: W0310 09:21:09.239533 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b114327_1a63_488a_aace_0488259b1278.slice/crio-0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c WatchSource:0}: Error finding container 0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c: Status 404 returned error can't find the container with id 0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.365444 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.366897 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370525 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cjf6k"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370622 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370537 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370739 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370879 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.371650 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.371707 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.379495 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.397519 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerStarted","Data":"27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d"}
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.400661 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerStarted","Data":"0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c"}
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409755 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409848 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409882 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409935 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409960 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410066 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410111 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410230 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410285 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410334 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410425 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.511546 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.512704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513333 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513616 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513755 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.512645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513813 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513897 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.514034 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.514082 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.514122 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.514196 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.515098 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.515394 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.516147 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.520901 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.521111 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.521752 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.524437 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.527540 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.535495 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.632643 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.634168 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.636920 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637213 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637219 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637357 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x4lhh"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637489 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637602 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637698 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.644202 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.696929 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.820783 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821103 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821153 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821174 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821216 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821239 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821270 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821304 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821323 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821345 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821379 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923026 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923085 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923135 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923163 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923211 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923235 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923257 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923275 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923417 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923438 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.924203 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.925035 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.925453 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.926595 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.926635 4883 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.927375 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.930006 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.933619 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.938935 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.939613 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " 
pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.942092 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.944621 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.972137 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 09:21:10 crc kubenswrapper[4883]: W0310 09:21:10.204141 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aa2bcd6_6a54_472f_bd1c_276e6f8caa07.slice/crio-010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6 WatchSource:0}: Error finding container 010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6: Status 404 returned error can't find the container with id 010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6 Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.206688 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.389614 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:21:10 crc kubenswrapper[4883]: W0310 09:21:10.404058 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdb6ba72_d1c8_4022_9029_2e18784e1139.slice/crio-531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848 WatchSource:0}: Error finding container 531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848: Status 404 returned error can't find the container with id 531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848 Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.438817 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerStarted","Data":"531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848"} Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.441065 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerStarted","Data":"010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6"} Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.757421 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.758866 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.760240 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.762443 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.764262 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-hmvxb" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.769225 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.776762 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.784730 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.942839 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-config-data-default\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.942900 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5dae6834-0ed6-4043-9efe-91745925591a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.942975 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh5ms\" (UniqueName: \"kubernetes.io/projected/5dae6834-0ed6-4043-9efe-91745925591a-kube-api-access-gh5ms\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943043 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-kolla-config\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943089 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943260 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943349 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943379 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044756 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044858 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044887 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044913 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-config-data-default\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044946 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5dae6834-0ed6-4043-9efe-91745925591a-config-data-generated\") 
pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.045014 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh5ms\" (UniqueName: \"kubernetes.io/projected/5dae6834-0ed6-4043-9efe-91745925591a-kube-api-access-gh5ms\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.045040 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-kolla-config\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.045066 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.045347 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.046015 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5dae6834-0ed6-4043-9efe-91745925591a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc 
kubenswrapper[4883]: I0310 09:21:11.047705 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-config-data-default\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.047877 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-kolla-config\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.048570 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.052469 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.061056 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh5ms\" (UniqueName: \"kubernetes.io/projected/5dae6834-0ed6-4043-9efe-91745925591a-kube-api-access-gh5ms\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.063113 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.065741 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.079968 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.577683 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.246784 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.250318 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.252945 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.252966 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.252970 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-66v9q" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.258063 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.258429 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365245 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xng5m\" (UniqueName: \"kubernetes.io/projected/287f174d-514a-4c8c-a70e-b6e64fe41653-kube-api-access-xng5m\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365311 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365451 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365510 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365550 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365762 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365814 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.468704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.468906 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xng5m\" (UniqueName: \"kubernetes.io/projected/287f174d-514a-4c8c-a70e-b6e64fe41653-kube-api-access-xng5m\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.468943 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469023 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469046 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469078 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469142 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469201 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469249 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.471780 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.472137 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.472744 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.472761 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.491054 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.491095 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.499583 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.508212 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xng5m\" (UniqueName: \"kubernetes.io/projected/287f174d-514a-4c8c-a70e-b6e64fe41653-kube-api-access-xng5m\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.577714 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.654669 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.656768 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.664286 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-7vzv4" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.664707 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.664921 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.685412 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774212 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-kolla-config\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774373 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp89f\" (UniqueName: \"kubernetes.io/projected/52bdcacc-ce19-418b-871c-35482038da29-kube-api-access-cp89f\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774551 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-config-data\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774680 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774794 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877163 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-config-data\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877265 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877336 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-kolla-config\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " 
pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877426 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp89f\" (UniqueName: \"kubernetes.io/projected/52bdcacc-ce19-418b-871c-35482038da29-kube-api-access-cp89f\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.878898 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-config-data\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.880288 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-kolla-config\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.889764 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.889782 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.902253 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp89f\" (UniqueName: 
\"kubernetes.io/projected/52bdcacc-ce19-418b-871c-35482038da29-kube-api-access-cp89f\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.984155 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 09:21:14 crc kubenswrapper[4883]: I0310 09:21:14.883175 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:21:14 crc kubenswrapper[4883]: I0310 09:21:14.884697 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 09:21:14 crc kubenswrapper[4883]: I0310 09:21:14.887248 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8zs8c" Mar 10 09:21:14 crc kubenswrapper[4883]: I0310 09:21:14.893826 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.037600 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86gr\" (UniqueName: \"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") pod \"kube-state-metrics-0\" (UID: \"5094e588-6ef7-4214-a96e-26d75ad98977\") " pod="openstack/kube-state-metrics-0" Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.139930 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k86gr\" (UniqueName: \"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") pod \"kube-state-metrics-0\" (UID: \"5094e588-6ef7-4214-a96e-26d75ad98977\") " pod="openstack/kube-state-metrics-0" Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.156368 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86gr\" (UniqueName: 
\"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") pod \"kube-state-metrics-0\" (UID: \"5094e588-6ef7-4214-a96e-26d75ad98977\") " pod="openstack/kube-state-metrics-0" Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.201089 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.549656 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5dae6834-0ed6-4043-9efe-91745925591a","Type":"ContainerStarted","Data":"95ee7666b24ae09d3b2e9bf6236b7e8d99bea51fde83eeeb876963a2df97ba11"} Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.453489 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.456949 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.461302 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.461360 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.461465 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.461869 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.462057 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jnn7z" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.465148 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.476113 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lb2z9"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.477350 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.479126 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.479829 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rp9zb" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.482656 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.495798 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qrl4s"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.497641 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.504459 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lb2z9"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.516709 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qrl4s"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601609 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601692 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28145780-82a1-453f-be56-b22c635f027e-scripts\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601728 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg6r6\" (UniqueName: \"kubernetes.io/projected/6691939e-adb0-420c-bf9e-f4a9b670c83b-kube-api-access-bg6r6\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601804 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 
09:21:18.601822 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-etc-ovs\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601847 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-combined-ca-bundle\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601869 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-run\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601896 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-ovn-controller-tls-certs\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601980 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602018 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602048 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602070 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-log-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602090 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602113 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-config\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602141 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99k66\" 
(UniqueName: \"kubernetes.io/projected/28145780-82a1-453f-be56-b22c635f027e-kube-api-access-99k66\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602165 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602186 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-lib\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602226 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j27tl\" (UniqueName: \"kubernetes.io/projected/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-kube-api-access-j27tl\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602248 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6691939e-adb0-420c-bf9e-f4a9b670c83b-scripts\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602268 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-log\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602287 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705361 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j27tl\" (UniqueName: \"kubernetes.io/projected/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-kube-api-access-j27tl\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705410 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6691939e-adb0-420c-bf9e-f4a9b670c83b-scripts\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705442 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-log\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705461 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run\") pod \"ovn-controller-lb2z9\" (UID: 
\"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705990 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706098 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706140 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28145780-82a1-453f-be56-b22c635f027e-scripts\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706170 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg6r6\" (UniqueName: \"kubernetes.io/projected/6691939e-adb0-420c-bf9e-f4a9b670c83b-kube-api-access-bg6r6\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706235 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706256 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-etc-ovs\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706283 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-combined-ca-bundle\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706304 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-run\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706326 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-ovn-controller-tls-certs\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706389 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706450 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706466 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-log-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706503 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706523 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-config\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706551 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99k66\" (UniqueName: \"kubernetes.io/projected/28145780-82a1-453f-be56-b22c635f027e-kube-api-access-99k66\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " 
pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706573 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706972 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.707571 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-lib\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.707748 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.709615 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-run\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.709833 4883 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28145780-82a1-453f-be56-b22c635f027e-scripts\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.710031 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-log\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.710105 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6691939e-adb0-420c-bf9e-f4a9b670c83b-scripts\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.710262 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-etc-ovs\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.711139 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.712399 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-lib\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " 
pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.713372 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-log-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.715264 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.718837 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-config\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.721809 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.728610 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.728673 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.729632 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-combined-ca-bundle\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.732328 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-ovn-controller-tls-certs\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.735185 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j27tl\" (UniqueName: \"kubernetes.io/projected/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-kube-api-access-j27tl\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.736295 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99k66\" (UniqueName: \"kubernetes.io/projected/28145780-82a1-453f-be56-b22c635f027e-kube-api-access-99k66\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.742707 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg6r6\" (UniqueName: \"kubernetes.io/projected/6691939e-adb0-420c-bf9e-f4a9b670c83b-kube-api-access-bg6r6\") pod 
\"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.759987 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.810265 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.817602 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.825808 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.224377 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.230521 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.234147 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.234728 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.234764 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.234873 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2swnw" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.256724 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.389866 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.389913 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.389947 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.389990 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.390009 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.390028 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q4tk\" (UniqueName: \"kubernetes.io/projected/249a9bf5-ef0f-4209-855e-3fa422106519-kube-api-access-9q4tk\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.390057 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.390087 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-config\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc 
kubenswrapper[4883]: I0310 09:21:22.493267 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493416 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493564 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493599 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493626 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q4tk\" (UniqueName: 
\"kubernetes.io/projected/249a9bf5-ef0f-4209-855e-3fa422106519-kube-api-access-9q4tk\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493686 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493777 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-config\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.495188 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.495351 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.495520 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-config\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 
09:21:22.495665 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.502198 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.503076 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.511290 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q4tk\" (UniqueName: \"kubernetes.io/projected/249a9bf5-ef0f-4209-855e-3fa422106519-kube-api-access-9q4tk\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.511952 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.516228 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.566830 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: E0310 09:21:22.731959 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 09:21:22 crc kubenswrapper[4883]: E0310 09:21:22.732260 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ljglh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-454p7_openstack(b1f1ef1a-4311-492d-b626-484f3b8ae836): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:21:22 crc kubenswrapper[4883]: E0310 09:21:22.733490 4883 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" podUID="b1f1ef1a-4311-492d-b626-484f3b8ae836" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.848007 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.864330 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") pod \"b1f1ef1a-4311-492d-b626-484f3b8ae836\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.864634 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") pod \"b1f1ef1a-4311-492d-b626-484f3b8ae836\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.864783 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") pod \"b1f1ef1a-4311-492d-b626-484f3b8ae836\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.866562 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config" (OuterVolumeSpecName: "config") pod "b1f1ef1a-4311-492d-b626-484f3b8ae836" (UID: "b1f1ef1a-4311-492d-b626-484f3b8ae836"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.866856 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1f1ef1a-4311-492d-b626-484f3b8ae836" (UID: "b1f1ef1a-4311-492d-b626-484f3b8ae836"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.874230 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh" (OuterVolumeSpecName: "kube-api-access-ljglh") pod "b1f1ef1a-4311-492d-b626-484f3b8ae836" (UID: "b1f1ef1a-4311-492d-b626-484f3b8ae836"). InnerVolumeSpecName "kube-api-access-ljglh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.970129 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.970165 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.970180 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.237040 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.346663 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-lb2z9"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.360601 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.367058 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.454355 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: W0310 09:21:26.470410 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5094e588_6ef7_4214_a96e_26d75ad98977.slice/crio-8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd WatchSource:0}: Error finding container 8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd: Status 404 returned error can't find the container with id 8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd Mar 10 09:21:26 crc kubenswrapper[4883]: W0310 09:21:26.472507 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod287f174d_514a_4c8c_a70e_b6e64fe41653.slice/crio-cf77515b6a8e23e44ad6eba001ccc79a5abfefcb2792c111c292972a2cf06ea7 WatchSource:0}: Error finding container cf77515b6a8e23e44ad6eba001ccc79a5abfefcb2792c111c292972a2cf06ea7: Status 404 returned error can't find the container with id cf77515b6a8e23e44ad6eba001ccc79a5abfefcb2792c111c292972a2cf06ea7 Mar 10 09:21:26 crc kubenswrapper[4883]: W0310 09:21:26.474441 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe383ddb_b33d_4129_acf8_1ffbbc21b1d4.slice/crio-21b6096c13b9eb852df5fe497b5e2eb1956343d65d21a4daf84a5660fbacbb10 WatchSource:0}: Error finding container 
21b6096c13b9eb852df5fe497b5e2eb1956343d65d21a4daf84a5660fbacbb10: Status 404 returned error can't find the container with id 21b6096c13b9eb852df5fe497b5e2eb1956343d65d21a4daf84a5660fbacbb10 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.536240 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: W0310 09:21:26.540088 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod249a9bf5_ef0f_4209_855e_3fa422106519.slice/crio-62849294d0207d516f219fbf4ddbd471c56ac1787264d9d45691377b27b19375 WatchSource:0}: Error finding container 62849294d0207d516f219fbf4ddbd471c56ac1787264d9d45691377b27b19375: Status 404 returned error can't find the container with id 62849294d0207d516f219fbf4ddbd471c56ac1787264d9d45691377b27b19375 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.654689 4883 generic.go:334] "Generic (PLEG): container finished" podID="6b114327-1a63-488a-aace-0488259b1278" containerID="0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a" exitCode=0 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.654803 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerDied","Data":"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.660740 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"be383ddb-b33d-4129-acf8-1ffbbc21b1d4","Type":"ContainerStarted","Data":"21b6096c13b9eb852df5fe497b5e2eb1956343d65d21a4daf84a5660fbacbb10"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.665256 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"5dae6834-0ed6-4043-9efe-91745925591a","Type":"ContainerStarted","Data":"a3b53efd4d291b31d42168aec7ece633e3c99bad85e93af9f7b973b657782982"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.671183 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249a9bf5-ef0f-4209-855e-3fa422106519","Type":"ContainerStarted","Data":"62849294d0207d516f219fbf4ddbd471c56ac1787264d9d45691377b27b19375"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.678614 4883 generic.go:334] "Generic (PLEG): container finished" podID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerID="a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd" exitCode=0 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.678867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerDied","Data":"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.682486 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"287f174d-514a-4c8c-a70e-b6e64fe41653","Type":"ContainerStarted","Data":"cf77515b6a8e23e44ad6eba001ccc79a5abfefcb2792c111c292972a2cf06ea7"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.684108 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9" event={"ID":"6691939e-adb0-420c-bf9e-f4a9b670c83b","Type":"ContainerStarted","Data":"4840599676e1d0ad84cee1bd1bb0b41f3c4212049cf840499f26fa38bb5474a4"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.685515 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"52bdcacc-ce19-418b-871c-35482038da29","Type":"ContainerStarted","Data":"3e7f41d3bfe364c680b02181ceb7d66bd5258c6231ec38db4d490c2f4cfc10fc"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 
09:21:26.689289 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5094e588-6ef7-4214-a96e-26d75ad98977","Type":"ContainerStarted","Data":"8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.700941 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.700925 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" event={"ID":"b1f1ef1a-4311-492d-b626-484f3b8ae836","Type":"ContainerDied","Data":"b4c4edf66ac859048fbaea168e4fb72023b5242b17e67fe3301372d7bb2750e3"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.702713 4883 generic.go:334] "Generic (PLEG): container finished" podID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" containerID="e00ff4ed38f4346987a643fce8da97d41c7e1cf34fa07996eca6ab319b7d076d" exitCode=0 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.702824 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" event={"ID":"f3a815ae-f56c-4ad8-a4cd-b202012bf94a","Type":"ContainerDied","Data":"e00ff4ed38f4346987a643fce8da97d41c7e1cf34fa07996eca6ab319b7d076d"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.760516 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.762348 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"] Mar 10 09:21:26 crc kubenswrapper[4883]: E0310 09:21:26.942339 4883 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 10 09:21:26 crc kubenswrapper[4883]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/8010fd0c-6a0f-4078-851d-aff31b9efa90/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 09:21:26 crc kubenswrapper[4883]: > podSandboxID="27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d" Mar 10 09:21:26 crc kubenswrapper[4883]: E0310 09:21:26.942617 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:21:26 crc kubenswrapper[4883]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zh9gd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Por
t:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78cb4465c9-wcttq_openstack(8010fd0c-6a0f-4078-851d-aff31b9efa90): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/8010fd0c-6a0f-4078-851d-aff31b9efa90/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 09:21:26 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:21:26 crc kubenswrapper[4883]: E0310 09:21:26.943928 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/8010fd0c-6a0f-4078-851d-aff31b9efa90/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.105234 4883 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.258803 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qrl4s"]
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.292930 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") pod \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") "
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.292984 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") pod \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") "
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.300426 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8" (OuterVolumeSpecName: "kube-api-access-96nm8") pod "f3a815ae-f56c-4ad8-a4cd-b202012bf94a" (UID: "f3a815ae-f56c-4ad8-a4cd-b202012bf94a"). InnerVolumeSpecName "kube-api-access-96nm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.318165 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config" (OuterVolumeSpecName: "config") pod "f3a815ae-f56c-4ad8-a4cd-b202012bf94a" (UID: "f3a815ae-f56c-4ad8-a4cd-b202012bf94a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.396052 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.396099 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.715345 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.715510 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" event={"ID":"f3a815ae-f56c-4ad8-a4cd-b202012bf94a","Type":"ContainerDied","Data":"83fd7b7c14ede20be946470ce0f3534de6b73ac0b442f2cc761c12340562dfa2"}
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.715829 4883 scope.go:117] "RemoveContainer" containerID="e00ff4ed38f4346987a643fce8da97d41c7e1cf34fa07996eca6ab319b7d076d"
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.718439 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerStarted","Data":"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5"}
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.721004 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"287f174d-514a-4c8c-a70e-b6e64fe41653","Type":"ContainerStarted","Data":"e9d6f667a090f843d59fd481c9c48cbf57a1b7366eea84e3bc032122f390cd65"}
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.725724 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerStarted","Data":"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc"}
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.725830 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.727675 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerStarted","Data":"cd676384987aa2244a851651d9f3c8c0df5750259fc99e409d5e9d090a18495f"}
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.729973 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerStarted","Data":"85b1226f3c138a389d93578b856a79c83ff2666be61efa138303092bb74abdff"}
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.785533 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" podStartSLOduration=3.123866549 podStartE2EDuration="19.785514705s" podCreationTimestamp="2026-03-10 09:21:08 +0000 UTC" firstStartedPulling="2026-03-10 09:21:09.241689685 +0000 UTC m=+1055.496587564" lastFinishedPulling="2026-03-10 09:21:25.90333783 +0000 UTC m=+1072.158235720" observedRunningTime="2026-03-10 09:21:27.779778799 +0000 UTC m=+1074.034676689" watchObservedRunningTime="2026-03-10 09:21:27.785514705 +0000 UTC m=+1074.040412594"
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.840061 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"]
Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.845717 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"]
Mar 10 09:21:28 crc kubenswrapper[4883]: I0310 09:21:28.096385 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f1ef1a-4311-492d-b626-484f3b8ae836" path="/var/lib/kubelet/pods/b1f1ef1a-4311-492d-b626-484f3b8ae836/volumes"
Mar 10 09:21:28 crc kubenswrapper[4883]: I0310 09:21:28.096871 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" path="/var/lib/kubelet/pods/f3a815ae-f56c-4ad8-a4cd-b202012bf94a/volumes"
Mar 10 09:21:29 crc kubenswrapper[4883]: I0310 09:21:29.750452 4883 generic.go:334] "Generic (PLEG): container finished" podID="5dae6834-0ed6-4043-9efe-91745925591a" containerID="a3b53efd4d291b31d42168aec7ece633e3c99bad85e93af9f7b973b657782982" exitCode=0
Mar 10 09:21:29 crc kubenswrapper[4883]: I0310 09:21:29.750525 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5dae6834-0ed6-4043-9efe-91745925591a","Type":"ContainerDied","Data":"a3b53efd4d291b31d42168aec7ece633e3c99bad85e93af9f7b973b657782982"}
Mar 10 09:21:30 crc kubenswrapper[4883]: I0310 09:21:30.760043 4883 generic.go:334] "Generic (PLEG): container finished" podID="287f174d-514a-4c8c-a70e-b6e64fe41653" containerID="e9d6f667a090f843d59fd481c9c48cbf57a1b7366eea84e3bc032122f390cd65" exitCode=0
Mar 10 09:21:30 crc kubenswrapper[4883]: I0310 09:21:30.760104 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"287f174d-514a-4c8c-a70e-b6e64fe41653","Type":"ContainerDied","Data":"e9d6f667a090f843d59fd481c9c48cbf57a1b7366eea84e3bc032122f390cd65"}
Mar 10 09:21:33 crc kubenswrapper[4883]: I0310 09:21:33.825573 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:33 crc kubenswrapper[4883]: I0310 09:21:33.869971 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"]
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.799911 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"287f174d-514a-4c8c-a70e-b6e64fe41653","Type":"ContainerStarted","Data":"757d30cd9b3c92fad097c14d32721d80969bb9767347c47720ed6d97b22675e4"}
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.804099 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249a9bf5-ef0f-4209-855e-3fa422106519","Type":"ContainerStarted","Data":"8d9ad6373ae411b2ac0f01369623f779183f9c96c307b2b12b9353ccddafc6fe"}
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.806836 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9" event={"ID":"6691939e-adb0-420c-bf9e-f4a9b670c83b","Type":"ContainerStarted","Data":"0c7f3445ef1622d3d2fec3d8e8194d8e369d0132dbeadfb718f36b61be9f6b4e"}
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.807318 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lb2z9"
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.809179 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"be383ddb-b33d-4129-acf8-1ffbbc21b1d4","Type":"ContainerStarted","Data":"54303e3827b4f36fd621f8f6ec606578a5f34bd241b06b9411ec5d61c45035a7"}
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.811910 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerStarted","Data":"ff7519323bfb04da3dfae0be6b11fba29616584669ab746b419f63a7bc1b5efc"}
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.814249 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5dae6834-0ed6-4043-9efe-91745925591a","Type":"ContainerStarted","Data":"d58644c8d73a34a611347de240574a01b69baf93c0a1009a92d6ebb3b29ef3f4"}
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.817737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5094e588-6ef7-4214-a96e-26d75ad98977","Type":"ContainerStarted","Data":"c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b"}
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.817889 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.823380 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerStarted","Data":"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85"}
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.823784 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="dnsmasq-dns" containerID="cri-o://768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85" gracePeriod=10
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.824507 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.825506 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.825463676 podStartE2EDuration="23.825463676s" podCreationTimestamp="2026-03-10 09:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:21:34.822688384 +0000 UTC m=+1081.077586272" watchObservedRunningTime="2026-03-10 09:21:34.825463676 +0000 UTC m=+1081.080361566"
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.831169 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"52bdcacc-ce19-418b-871c-35482038da29","Type":"ContainerStarted","Data":"81e78ce75df4af1031f82d7356f973031ac1435a97eb2cebb5c3769550d451b6"}
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.832202 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.844198 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lb2z9" podStartSLOduration=9.067883535 podStartE2EDuration="16.844176481s" podCreationTimestamp="2026-03-10 09:21:18 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.36142475 +0000 UTC m=+1072.616322640" lastFinishedPulling="2026-03-10 09:21:34.137717697 +0000 UTC m=+1080.392615586" observedRunningTime="2026-03-10 09:21:34.836045339 +0000 UTC m=+1081.090943229" watchObservedRunningTime="2026-03-10 09:21:34.844176481 +0000 UTC m=+1081.099074370"
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.849663 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.204528349 podStartE2EDuration="20.849646455s" podCreationTimestamp="2026-03-10 09:21:14 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.473058445 +0000 UTC m=+1072.727956334" lastFinishedPulling="2026-03-10 09:21:34.118176551 +0000 UTC m=+1080.373074440" observedRunningTime="2026-03-10 09:21:34.847865227 +0000 UTC m=+1081.102763116" watchObservedRunningTime="2026-03-10 09:21:34.849646455 +0000 UTC m=+1081.104544344"
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.871919 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=15.177894914 podStartE2EDuration="25.87190174s" podCreationTimestamp="2026-03-10 09:21:09 +0000 UTC" firstStartedPulling="2026-03-10 09:21:15.124622922 +0000 UTC m=+1061.379520810" lastFinishedPulling="2026-03-10 09:21:25.818629747 +0000 UTC m=+1072.073527636" observedRunningTime="2026-03-10 09:21:34.865698653 +0000 UTC m=+1081.120596542" watchObservedRunningTime="2026-03-10 09:21:34.87190174 +0000 UTC m=+1081.126799628"
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.895092 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" podStartSLOduration=10.209000871 podStartE2EDuration="26.895070696s" podCreationTimestamp="2026-03-10 09:21:08 +0000 UTC" firstStartedPulling="2026-03-10 09:21:09.110310048 +0000 UTC m=+1055.365207937" lastFinishedPulling="2026-03-10 09:21:25.796379873 +0000 UTC m=+1072.051277762" observedRunningTime="2026-03-10 09:21:34.891681055 +0000 UTC m=+1081.146578944" watchObservedRunningTime="2026-03-10 09:21:34.895070696 +0000 UTC m=+1081.149968585"
Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.910216 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.083856854 podStartE2EDuration="22.910200876s" podCreationTimestamp="2026-03-10 09:21:12 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.290952059 +0000 UTC m=+1072.545849948" lastFinishedPulling="2026-03-10 09:21:34.117296081 +0000 UTC m=+1080.372193970" observedRunningTime="2026-03-10 09:21:34.907428708 +0000 UTC m=+1081.162326597" watchObservedRunningTime="2026-03-10 09:21:34.910200876 +0000 UTC m=+1081.165098764"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.208533 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.348223 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") pod \"8010fd0c-6a0f-4078-851d-aff31b9efa90\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") "
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.348321 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") pod \"8010fd0c-6a0f-4078-851d-aff31b9efa90\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") "
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.348374 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") pod \"8010fd0c-6a0f-4078-851d-aff31b9efa90\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") "
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.355098 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd" (OuterVolumeSpecName: "kube-api-access-zh9gd") pod "8010fd0c-6a0f-4078-851d-aff31b9efa90" (UID: "8010fd0c-6a0f-4078-851d-aff31b9efa90"). InnerVolumeSpecName "kube-api-access-zh9gd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.392181 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8010fd0c-6a0f-4078-851d-aff31b9efa90" (UID: "8010fd0c-6a0f-4078-851d-aff31b9efa90"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.403714 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config" (OuterVolumeSpecName: "config") pod "8010fd0c-6a0f-4078-851d-aff31b9efa90" (UID: "8010fd0c-6a0f-4078-851d-aff31b9efa90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.450679 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.450714 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.450725 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847105 4883 generic.go:334] "Generic (PLEG): container finished" podID="28145780-82a1-453f-be56-b22c635f027e" containerID="ff7519323bfb04da3dfae0be6b11fba29616584669ab746b419f63a7bc1b5efc" exitCode=0
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847219 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerDied","Data":"ff7519323bfb04da3dfae0be6b11fba29616584669ab746b419f63a7bc1b5efc"}
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847444 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerStarted","Data":"eb6a1dadbf21b091ec9e4aec4a2d239facf3a09a6ef6e5a7052eb11f03a359b8"}
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847458 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerStarted","Data":"6a52ddd2e9b0d4babaaa81937e6d0f741f9dc8aaa1166943b17dd4fe2b997126"}
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847489 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qrl4s"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847517 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qrl4s"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856342 4883 generic.go:334] "Generic (PLEG): container finished" podID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerID="768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85" exitCode=0
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856390 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856437 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerDied","Data":"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85"}
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856486 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerDied","Data":"27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d"}
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856516 4883 scope.go:117] "RemoveContainer" containerID="768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.871576 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qrl4s" podStartSLOduration=11.161336447 podStartE2EDuration="17.871563571s" podCreationTimestamp="2026-03-10 09:21:18 +0000 UTC" firstStartedPulling="2026-03-10 09:21:27.426238882 +0000 UTC m=+1073.681136772" lastFinishedPulling="2026-03-10 09:21:34.136466006 +0000 UTC m=+1080.391363896" observedRunningTime="2026-03-10 09:21:35.865749258 +0000 UTC m=+1082.120647147" watchObservedRunningTime="2026-03-10 09:21:35.871563571 +0000 UTC m=+1082.126461460"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.890201 4883 scope.go:117] "RemoveContainer" containerID="a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.893612 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"]
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.901788 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"]
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.916293 4883 scope.go:117] "RemoveContainer" containerID="768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85"
Mar 10 09:21:35 crc kubenswrapper[4883]: E0310 09:21:35.916704 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85\": container with ID starting with 768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85 not found: ID does not exist" containerID="768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.916728 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85"} err="failed to get container status \"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85\": rpc error: code = NotFound desc = could not find container \"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85\": container with ID starting with 768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85 not found: ID does not exist"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.916747 4883 scope.go:117] "RemoveContainer" containerID="a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd"
Mar 10 09:21:35 crc kubenswrapper[4883]: E0310 09:21:35.917074 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd\": container with ID starting with a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd not found: ID does not exist" containerID="a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd"
Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.917119 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd"} err="failed to get container status \"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd\": rpc error: code = NotFound desc = could not find container \"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd\": container with ID starting with a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd not found: ID does not exist"
Mar 10 09:21:36 crc kubenswrapper[4883]: I0310 09:21:36.089024 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" path="/var/lib/kubelet/pods/8010fd0c-6a0f-4078-851d-aff31b9efa90/volumes"
Mar 10 09:21:38 crc kubenswrapper[4883]: I0310 09:21:38.886964 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249a9bf5-ef0f-4209-855e-3fa422106519","Type":"ContainerStarted","Data":"149b82326646a72af1f60664f1c7e944ac1d163d90fea10e539ffa7351dfffd1"}
Mar 10 09:21:38 crc kubenswrapper[4883]: I0310 09:21:38.889174 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"be383ddb-b33d-4129-acf8-1ffbbc21b1d4","Type":"ContainerStarted","Data":"427ce480cdf58a25d715353338cd5927df39a666005d830c3d5b50ef1cdcab10"}
Mar 10 09:21:38 crc kubenswrapper[4883]: I0310 09:21:38.910870 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.619463932 podStartE2EDuration="17.910853563s" podCreationTimestamp="2026-03-10 09:21:21 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.541646493 +0000 UTC m=+1072.796544382" lastFinishedPulling="2026-03-10 09:21:37.833036123 +0000 UTC m=+1084.087934013" observedRunningTime="2026-03-10 09:21:38.90550025 +0000 UTC m=+1085.160398138" watchObservedRunningTime="2026-03-10 09:21:38.910853563 +0000 UTC m=+1085.165751453"
Mar 10 09:21:38 crc kubenswrapper[4883]: I0310 09:21:38.925951 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.5844231 podStartE2EDuration="21.925936433s" podCreationTimestamp="2026-03-10 09:21:17 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.476411097 +0000 UTC m=+1072.731308985" lastFinishedPulling="2026-03-10 09:21:37.817924429 +0000 UTC m=+1084.072822318" observedRunningTime="2026-03-10 09:21:38.921005566 +0000 UTC m=+1085.175903454" watchObservedRunningTime="2026-03-10 09:21:38.925936433 +0000 UTC m=+1085.180834323"
Mar 10 09:21:39 crc kubenswrapper[4883]: I0310 09:21:39.811464 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 10 09:21:39 crc kubenswrapper[4883]: I0310 09:21:39.843174 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 10 09:21:39 crc kubenswrapper[4883]: I0310 09:21:39.894720 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 10 09:21:39 crc kubenswrapper[4883]: I0310 09:21:39.924331 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145206 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"]
Mar 10 09:21:40 crc kubenswrapper[4883]: E0310 09:21:40.145647 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" containerName="init"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145668 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" containerName="init"
Mar 10 09:21:40 crc kubenswrapper[4883]: E0310 09:21:40.145715 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="init"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145722 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="init"
Mar 10 09:21:40 crc kubenswrapper[4883]: E0310 09:21:40.145741 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="dnsmasq-dns"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145747 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="dnsmasq-dns"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145936 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="dnsmasq-dns"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145959 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" containerName="init"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.146842 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.149387 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.154003 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"]
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.177941 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-b2z2p"]
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.179038 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.185735 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.193737 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-b2z2p"]
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271028 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovs-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271100 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271142 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271157 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-combined-ca-bundle\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271178 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvnns\" (UniqueName: \"kubernetes.io/projected/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-kube-api-access-gvnns\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271428 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271500 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-config\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271691 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovn-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271832 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271903 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374501 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovn-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374671 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374739 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374778 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovs-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374824 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374845 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovn-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374873 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374894 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-combined-ca-bundle\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374920 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvnns\" (UniqueName: \"kubernetes.io/projected/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-kube-api-access-gvnns\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.375183 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.375184 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovs-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.375208 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-config\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.376155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-config\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.376195 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.376335 4883
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.376390 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.386218 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.386329 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-combined-ca-bundle\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.390605 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.393640 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gvnns\" (UniqueName: \"kubernetes.io/projected/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-kube-api-access-gvnns\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.463052 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.495267 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.568174 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.584939 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.593436 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.600664 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.603385 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.603824 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.631128 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.679873 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.680214 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.680263 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.680327 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqx5f\" (UniqueName: 
\"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.680372 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782149 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782281 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782323 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782368 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqx5f\" (UniqueName: 
\"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782396 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.783155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.783305 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.783561 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.783630 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") pod 
\"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.800410 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqx5f\" (UniqueName: \"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.904542 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.921229 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.932538 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: W0310 09:21:40.936276 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb2ce411_30fc_481a_bfe4_a73537462f13.slice/crio-b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264 WatchSource:0}: Error finding container b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264: Status 404 returned error can't find the container with id b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264 Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.939694 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.009755 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-b2z2p"] Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.074132 4883 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.075591 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.080260 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.080284 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.086122 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.094093 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-znz9b" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.094266 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.094395 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.096232 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.173889 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191399 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgxrq\" (UniqueName: \"kubernetes.io/projected/b47099e9-f945-4873-a704-ee55b0f0ac46-kube-api-access-zgxrq\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: 
I0310 09:21:41.191468 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191594 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-scripts\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191932 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191959 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-config\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.192114 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294148 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294418 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294443 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-config\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294531 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294566 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgxrq\" (UniqueName: \"kubernetes.io/projected/b47099e9-f945-4873-a704-ee55b0f0ac46-kube-api-access-zgxrq\") pod \"ovn-northd-0\" (UID: 
\"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294586 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294625 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-scripts\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.295645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-config\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.295751 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-scripts\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.296310 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.300894 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.301605 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.301825 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.310369 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgxrq\" (UniqueName: \"kubernetes.io/projected/b47099e9-f945-4873-a704-ee55b0f0ac46-kube-api-access-zgxrq\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.401119 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:41 crc kubenswrapper[4883]: W0310 09:21:41.402762 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34eb524a_8ba3_4157_8a0c_efd069843d47.slice/crio-217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c WatchSource:0}: Error finding container 217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c: Status 404 returned error can't find the container with id 217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c Mar 10 09:21:41 crc 
kubenswrapper[4883]: I0310 09:21:41.404622 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.831122 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.911348 4883 generic.go:334] "Generic (PLEG): container finished" podID="eb2ce411-30fc-481a-bfe4-a73537462f13" containerID="f83a160ef6da8cae8c43ffcc9b804f634b417db7bccd27ca82a5481ab275d14d" exitCode=0 Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.911434 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" event={"ID":"eb2ce411-30fc-481a-bfe4-a73537462f13","Type":"ContainerDied","Data":"f83a160ef6da8cae8c43ffcc9b804f634b417db7bccd27ca82a5481ab275d14d"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.912846 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" event={"ID":"eb2ce411-30fc-481a-bfe4-a73537462f13","Type":"ContainerStarted","Data":"b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.916657 4883 generic.go:334] "Generic (PLEG): container finished" podID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerID="ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210" exitCode=0 Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.918138 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerDied","Data":"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.918173 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" 
event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerStarted","Data":"217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.924597 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-b2z2p" event={"ID":"570aed6d-03dc-4ad5-b0e1-c6efc4facabb","Type":"ContainerStarted","Data":"72811aa537601afcdfe38454de17e8b1a22c3617c3e3f18a7e9d5f5d5019c053"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.924633 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-b2z2p" event={"ID":"570aed6d-03dc-4ad5-b0e1-c6efc4facabb","Type":"ContainerStarted","Data":"c9dde48ae5937ebc556ac88d14157e6ec7995d79e53bd665fb08c6c831ed287b"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.926901 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b47099e9-f945-4873-a704-ee55b0f0ac46","Type":"ContainerStarted","Data":"f1b12d1d4dcfaf0291f0b1f6afa72663645f37817f4e3027a8139ee596f05c86"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.963529 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-b2z2p" podStartSLOduration=1.963501172 podStartE2EDuration="1.963501172s" podCreationTimestamp="2026-03-10 09:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:21:41.951962526 +0000 UTC m=+1088.206860415" watchObservedRunningTime="2026-03-10 09:21:41.963501172 +0000 UTC m=+1088.218399061" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.255776 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.269444 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.325102 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.325246 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.325308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.325357 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.331600 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h" (OuterVolumeSpecName: "kube-api-access-jlj5h") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13"). InnerVolumeSpecName "kube-api-access-jlj5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:42 crc kubenswrapper[4883]: E0310 09:21:42.357245 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc podName:eb2ce411-30fc-481a-bfe4-a73537462f13 nodeName:}" failed. No retries permitted until 2026-03-10 09:21:42.857210813 +0000 UTC m=+1089.112108703 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13") : error deleting /var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volume-subpaths: remove /var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volume-subpaths: no such file or directory Mar 10 09:21:42 crc kubenswrapper[4883]: E0310 09:21:42.357281 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb podName:eb2ce411-30fc-481a-bfe4-a73537462f13 nodeName:}" failed. No retries permitted until 2026-03-10 09:21:42.85727331 +0000 UTC m=+1089.112171199 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13") : error deleting /var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volume-subpaths: remove /var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volume-subpaths: no such file or directory Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.357844 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config" (OuterVolumeSpecName: "config") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.428545 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.428588 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.578540 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.578846 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.646328 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.936619 4883 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") "
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.936760 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") "
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.937270 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.937351 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.937646 4883 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v"
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.938098 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" event={"ID":"eb2ce411-30fc-481a-bfe4-a73537462f13","Type":"ContainerDied","Data":"b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264"}
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.938184 4883 scope.go:117] "RemoveContainer" containerID="f83a160ef6da8cae8c43ffcc9b804f634b417db7bccd27ca82a5481ab275d14d"
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.940833 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerStarted","Data":"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351"}
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.941672 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c"
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.964553 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" podStartSLOduration=2.964526955 podStartE2EDuration="2.964526955s" podCreationTimestamp="2026-03-10 09:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:21:42.959160527 +0000 UTC m=+1089.214058426" watchObservedRunningTime="2026-03-10 09:21:42.964526955 +0000 UTC m=+1089.219424845"
Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.985670 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.009992 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"]
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310
09:21:43.015310 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"]
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.020937 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.039033 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.040259 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.226874 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pv8r6"]
Mar 10 09:21:43 crc kubenswrapper[4883]: E0310 09:21:43.227272 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2ce411-30fc-481a-bfe4-a73537462f13" containerName="init"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.227286 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2ce411-30fc-481a-bfe4-a73537462f13" containerName="init"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.227488 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2ce411-30fc-481a-bfe4-a73537462f13" containerName="init"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.228050 4883 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-db-create-pv8r6"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.243052 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"]
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.244387 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8903-account-create-update-lxrp4"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.246260 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.248677 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"]
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.257764 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pv8r6"]
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.349028 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.349397 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.349443 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName:
\"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.349492 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.450709 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.451036 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.451294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.451939 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName:
\"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.452119 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.453074 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.466218 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.468463 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.544786 4883 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-db-create-pv8r6"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.563701 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8903-account-create-update-lxrp4"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.957156 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b47099e9-f945-4873-a704-ee55b0f0ac46","Type":"ContainerStarted","Data":"2862e53b8919872fce8a435977928cd15d8b5473ec1ce12df00e6ab89bc3fc6b"}
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.957648 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b47099e9-f945-4873-a704-ee55b0f0ac46","Type":"ContainerStarted","Data":"785ae0bad3e887c65aef7d320008a437a53cfb493cc40ce86aa9113ab88165d4"}
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.958328 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.972356 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pv8r6"]
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.982264 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mj8nd"]
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.984275 4883 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-db-create-mj8nd"
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.991712 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mj8nd"]
Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.992908 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.378284609 podStartE2EDuration="2.992886499s" podCreationTimestamp="2026-03-10 09:21:41 +0000 UTC" firstStartedPulling="2026-03-10 09:21:41.836034263 +0000 UTC m=+1088.090932152" lastFinishedPulling="2026-03-10 09:21:43.450636163 +0000 UTC m=+1089.705534042" observedRunningTime="2026-03-10 09:21:43.976786682 +0000 UTC m=+1090.231684570" watchObservedRunningTime="2026-03-10 09:21:43.992886499 +0000 UTC m=+1090.247784388"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.037326 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"]
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.066094 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.066488 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.100493 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir"
podUID="eb2ce411-30fc-481a-bfe4-a73537462f13" path="/var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volumes"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.103925 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"]
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.105454 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.108041 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.118341 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"]
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.168718 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.168805 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.168833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") pod \"keystone-d500-account-create-update-fpfdr\" (UID:
\"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.168984 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.171376 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.185322 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.193223 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-j9kwf"]
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.194557 4883 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.200968 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-j9kwf"]
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.271374 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.271430 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.271467 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.271860 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.272405 4883 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.290283 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.298315 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"]
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.299323 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.300882 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.301594 4883 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-db-create-mj8nd"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.320752 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"]
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.375648 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.375781 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.375833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.375895 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.376602 4883
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.395272 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.422926 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.476754 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.476858 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.477917 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName:
\"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.499671 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.507057 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.640905 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.708095 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mj8nd"]
Mar 10 09:21:44 crc kubenswrapper[4883]: W0310 09:21:44.711141 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod258d7844_9a92_460a_a768_a5dca2fb5db9.slice/crio-3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34 WatchSource:0}: Error finding container 3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34: Status 404 returned error can't find the container with id 3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.860266 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"]
Mar 10 09:21:44 crc kubenswrapper[4883]: W0310 09:21:44.863547 4883
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod486b3226_21be_4783_8b29_abaf747a7693.slice/crio-e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9 WatchSource:0}: Error finding container e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9: Status 404 returned error can't find the container with id e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.933895 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-j9kwf"]
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.988059 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d500-account-create-update-fpfdr" event={"ID":"486b3226-21be-4783-8b29-abaf747a7693","Type":"ContainerStarted","Data":"e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9"}
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.989947 4883 generic.go:334] "Generic (PLEG): container finished" podID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" containerID="b981b386d21855c9b21b1262acdcccebfb4995ef8da840373e95a5a29e03699c" exitCode=0
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.990010 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pv8r6" event={"ID":"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34","Type":"ContainerDied","Data":"b981b386d21855c9b21b1262acdcccebfb4995ef8da840373e95a5a29e03699c"}
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.990041 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pv8r6" event={"ID":"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34","Type":"ContainerStarted","Data":"c3c2965142f5b5713694be3cd8baab99c20a6e06108db60b12703ba3ffb904b6"}
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.993931 4883 generic.go:334] "Generic (PLEG): container finished" podID="698612ed-a736-4d3d-9a0e-4c75fdd1400f"
containerID="e7877e4d896a5e48fb94d0bb9e636d179a97dbbe531d524d3bf059533ec08d74" exitCode=0
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.993983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8903-account-create-update-lxrp4" event={"ID":"698612ed-a736-4d3d-9a0e-4c75fdd1400f","Type":"ContainerDied","Data":"e7877e4d896a5e48fb94d0bb9e636d179a97dbbe531d524d3bf059533ec08d74"}
Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.994002 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8903-account-create-update-lxrp4" event={"ID":"698612ed-a736-4d3d-9a0e-4c75fdd1400f","Type":"ContainerStarted","Data":"b951262476bceeb4b809409cf98537b24497c176feb26e10d9d261770e483efe"}
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:44.995946 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mj8nd" event={"ID":"258d7844-9a92-460a-a768-a5dca2fb5db9","Type":"ContainerStarted","Data":"067357e3a311966c2851280ac43f9cd60519f964886eed01635d2c8acbc8045b"}
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:44.995972 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mj8nd" event={"ID":"258d7844-9a92-460a-a768-a5dca2fb5db9","Type":"ContainerStarted","Data":"3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34"}
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.003121 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j9kwf" event={"ID":"6195b8a8-c8aa-4d92-b58b-066a2df99bd3","Type":"ContainerStarted","Data":"136224520a357418a498376a1cbdd0153ccb3d2fdb86788ac7b44dece177b573"}
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.071975 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"]
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.232051 4883 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"]
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.232292 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="dnsmasq-dns" containerID="cri-o://8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" gracePeriod=10
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.248369 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.267977 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"]
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.269415 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk"
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302648 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk"
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302697 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk"
Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302754 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName:
\"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302785 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302808 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.310331 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405382 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405489 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405542 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405737 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405786 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.406315 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.406694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.406949 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.407450 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.457174 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.590800 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.890111 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.012270 4883 generic.go:334] "Generic (PLEG): container finished" podID="58599ed2-6176-4003-8bdc-2a1d805da51f" containerID="9b3a01ef455743297929fe3e8d915e6b5c1a6d87ee8313151edd54b3c5c1c1d3" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.012395 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c7c6-account-create-update-bzdlt" event={"ID":"58599ed2-6176-4003-8bdc-2a1d805da51f","Type":"ContainerDied","Data":"9b3a01ef455743297929fe3e8d915e6b5c1a6d87ee8313151edd54b3c5c1c1d3"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.012498 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c7c6-account-create-update-bzdlt" event={"ID":"58599ed2-6176-4003-8bdc-2a1d805da51f","Type":"ContainerStarted","Data":"fb11cd34a85d26912302a2922a691d05843fde0fdc020b7213b5e6c9c65ef2fe"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.014454 4883 generic.go:334] "Generic (PLEG): container finished" podID="258d7844-9a92-460a-a768-a5dca2fb5db9" containerID="067357e3a311966c2851280ac43f9cd60519f964886eed01635d2c8acbc8045b" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.014555 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mj8nd" event={"ID":"258d7844-9a92-460a-a768-a5dca2fb5db9","Type":"ContainerDied","Data":"067357e3a311966c2851280ac43f9cd60519f964886eed01635d2c8acbc8045b"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.015145 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.015182 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.015241 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.016250 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqx5f\" (UniqueName: \"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.016378 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021032 4883 generic.go:334] "Generic (PLEG): container finished" podID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerID="8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021105 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerDied","Data":"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021128 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerDied","Data":"217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021149 4883 scope.go:117] "RemoveContainer" containerID="8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021304 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.030759 4883 generic.go:334] "Generic (PLEG): container finished" podID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" containerID="86cc309342e04f12de9f243fac1e7adc270651f62f05738383b3854942ebc072" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.030839 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j9kwf" event={"ID":"6195b8a8-c8aa-4d92-b58b-066a2df99bd3","Type":"ContainerDied","Data":"86cc309342e04f12de9f243fac1e7adc270651f62f05738383b3854942ebc072"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.033749 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.036647 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f" (OuterVolumeSpecName: "kube-api-access-cqx5f") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "kube-api-access-cqx5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.037770 4883 generic.go:334] "Generic (PLEG): container finished" podID="486b3226-21be-4783-8b29-abaf747a7693" containerID="6677f5c2edc8cf5df63041699d2713762ffd5b4bdf18bb3f374e397d55004166" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.038031 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d500-account-create-update-fpfdr" event={"ID":"486b3226-21be-4783-8b29-abaf747a7693","Type":"ContainerDied","Data":"6677f5c2edc8cf5df63041699d2713762ffd5b4bdf18bb3f374e397d55004166"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.063649 4883 scope.go:117] "RemoveContainer" containerID="ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.067936 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.075355 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config" (OuterVolumeSpecName: "config") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.080026 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.081727 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.099013 4883 scope.go:117] "RemoveContainer" containerID="8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.099438 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351\": container with ID starting with 8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351 not found: ID does not exist" containerID="8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.099495 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351"} err="failed to get container status \"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351\": rpc error: code = NotFound desc = could not find container 
\"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351\": container with ID starting with 8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351 not found: ID does not exist" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.099523 4883 scope.go:117] "RemoveContainer" containerID="ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.100700 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210\": container with ID starting with ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210 not found: ID does not exist" containerID="ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.100727 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210"} err="failed to get container status \"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210\": rpc error: code = NotFound desc = could not find container \"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210\": container with ID starting with ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210 not found: ID does not exist" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.119678 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.119701 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: 
I0310 09:21:46.119710 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.119720 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.119729 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqx5f\" (UniqueName: \"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.364979 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.368951 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.377715 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.402726 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.414927 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.415409 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="dnsmasq-dns" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.415515 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="dnsmasq-dns" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.415595 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698612ed-a736-4d3d-9a0e-4c75fdd1400f" containerName="mariadb-account-create-update" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.415644 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="698612ed-a736-4d3d-9a0e-4c75fdd1400f" containerName="mariadb-account-create-update" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.415707 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="init" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.415753 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="init" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.415821 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.415868 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.416150 4883 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="698612ed-a736-4d3d-9a0e-4c75fdd1400f" containerName="mariadb-account-create-update" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.416233 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="dnsmasq-dns" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.416291 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.423496 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.423688 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.425443 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") pod \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.425749 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") pod \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.426374 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" (UID: "84b33a38-a6de-4d7e-b24d-ecf5f95f6c34"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.426464 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.427317 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.427985 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.428132 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lfqvv" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.428160 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.437382 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8" (OuterVolumeSpecName: "kube-api-access-pwvs8") pod "84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" (UID: "84b33a38-a6de-4d7e-b24d-ecf5f95f6c34"). InnerVolumeSpecName "kube-api-access-pwvs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.440050 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.482268 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-n4vhh"] Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.483514 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258d7844-9a92-460a-a768-a5dca2fb5db9" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.483600 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="258d7844-9a92-460a-a768-a5dca2fb5db9" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.483844 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="258d7844-9a92-460a-a768-a5dca2fb5db9" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.484636 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.488320 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.488634 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.488767 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.499249 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n4vhh"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.527768 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") pod 
\"698612ed-a736-4d3d-9a0e-4c75fdd1400f\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.528317 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") pod \"258d7844-9a92-460a-a768-a5dca2fb5db9\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.528374 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") pod \"258d7844-9a92-460a-a768-a5dca2fb5db9\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.528799 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") pod \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529131 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "258d7844-9a92-460a-a768-a5dca2fb5db9" (UID: "258d7844-9a92-460a-a768-a5dca2fb5db9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529206 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "698612ed-a736-4d3d-9a0e-4c75fdd1400f" (UID: "698612ed-a736-4d3d-9a0e-4c75fdd1400f"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529257 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529312 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-cache\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529498 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng2x8\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-kube-api-access-ng2x8\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529564 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529611 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " 
pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529658 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529818 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fdf41f-a914-4d0f-8d0c-5e378567a2db-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529885 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-lock\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529933 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529952 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.530139 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.530177 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.530191 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.532037 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk" (OuterVolumeSpecName: "kube-api-access-dxrzk") pod "258d7844-9a92-460a-a768-a5dca2fb5db9" (UID: "258d7844-9a92-460a-a768-a5dca2fb5db9"). InnerVolumeSpecName "kube-api-access-dxrzk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.532994 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv" (OuterVolumeSpecName: "kube-api-access-dnddv") pod "698612ed-a736-4d3d-9a0e-4c75fdd1400f" (UID: "698612ed-a736-4d3d-9a0e-4c75fdd1400f"). InnerVolumeSpecName "kube-api-access-dnddv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632100 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng2x8\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-kube-api-access-ng2x8\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632155 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632184 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632214 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632259 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632290 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632309 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fdf41f-a914-4d0f-8d0c-5e378567a2db-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632334 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632355 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-lock\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632377 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632442 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632463 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-cache\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632545 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632561 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-lock\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.633041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-cache\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.633165 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.633238 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.633259 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.633324 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" failed. No retries permitted until 2026-03-10 09:21:47.133304176 +0000 UTC m=+1093.388202065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.633914 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.633917 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.634944 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.638566 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.639031 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fdf41f-a914-4d0f-8d0c-5e378567a2db-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.639883 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.640439 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.649190 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng2x8\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-kube-api-access-ng2x8\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.649687 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.653836 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.820613 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.054014 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n4vhh"]
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.065182 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8903-account-create-update-lxrp4" event={"ID":"698612ed-a736-4d3d-9a0e-4c75fdd1400f","Type":"ContainerDied","Data":"b951262476bceeb4b809409cf98537b24497c176feb26e10d9d261770e483efe"}
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.065498 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b951262476bceeb4b809409cf98537b24497c176feb26e10d9d261770e483efe"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.065590 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8903-account-create-update-lxrp4"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.079862 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mj8nd" event={"ID":"258d7844-9a92-460a-a768-a5dca2fb5db9","Type":"ContainerDied","Data":"3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34"}
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.080291 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.080348 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mj8nd"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.081710 4883 generic.go:334] "Generic (PLEG): container finished" podID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerID="e241856c7e2d9bdc80ba8f22b6df1569df773a3d438f85a0d6ce70af2f1197e4" exitCode=0
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.081811 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerDied","Data":"e241856c7e2d9bdc80ba8f22b6df1569df773a3d438f85a0d6ce70af2f1197e4"}
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.081849 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerStarted","Data":"4fe8d37616588503394b5e0c543034c9d75405c62d38c6a3b4276c05a61d4d46"}
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.091058 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pv8r6"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.091885 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pv8r6" event={"ID":"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34","Type":"ContainerDied","Data":"c3c2965142f5b5713694be3cd8baab99c20a6e06108db60b12703ba3ffb904b6"}
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.091928 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3c2965142f5b5713694be3cd8baab99c20a6e06108db60b12703ba3ffb904b6"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.144356 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:47 crc kubenswrapper[4883]: E0310 09:21:47.145535 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 10 09:21:47 crc kubenswrapper[4883]: E0310 09:21:47.145570 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 10 09:21:47 crc kubenswrapper[4883]: E0310 09:21:47.145614 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" failed. No retries permitted until 2026-03-10 09:21:48.145596013 +0000 UTC m=+1094.400493902 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.448184 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.544051 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.550310 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.551382 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") pod \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") "
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.551587 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") pod \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") "
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.552195 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6195b8a8-c8aa-4d92-b58b-066a2df99bd3" (UID: "6195b8a8-c8aa-4d92-b58b-066a2df99bd3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.552335 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.560659 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd" (OuterVolumeSpecName: "kube-api-access-vf2pd") pod "6195b8a8-c8aa-4d92-b58b-066a2df99bd3" (UID: "6195b8a8-c8aa-4d92-b58b-066a2df99bd3"). InnerVolumeSpecName "kube-api-access-vf2pd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653134 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") pod \"58599ed2-6176-4003-8bdc-2a1d805da51f\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") "
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653187 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") pod \"58599ed2-6176-4003-8bdc-2a1d805da51f\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") "
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653815 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") pod \"486b3226-21be-4783-8b29-abaf747a7693\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") "
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653876 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") pod \"486b3226-21be-4783-8b29-abaf747a7693\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") "
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653807 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58599ed2-6176-4003-8bdc-2a1d805da51f" (UID: "58599ed2-6176-4003-8bdc-2a1d805da51f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.654182 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.654208 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.654252 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "486b3226-21be-4783-8b29-abaf747a7693" (UID: "486b3226-21be-4783-8b29-abaf747a7693"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.656430 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr" (OuterVolumeSpecName: "kube-api-access-kg4kr") pod "58599ed2-6176-4003-8bdc-2a1d805da51f" (UID: "58599ed2-6176-4003-8bdc-2a1d805da51f"). InnerVolumeSpecName "kube-api-access-kg4kr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.656784 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5" (OuterVolumeSpecName: "kube-api-access-mvwk5") pod "486b3226-21be-4783-8b29-abaf747a7693" (UID: "486b3226-21be-4783-8b29-abaf747a7693"). InnerVolumeSpecName "kube-api-access-mvwk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.755670 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.755699 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.755711 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") on node \"crc\" DevicePath \"\""
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.118920 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" path="/var/lib/kubelet/pods/34eb524a-8ba3-4157-8a0c-efd069843d47/volumes"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.166131 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.167498 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.167526 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.167576 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" failed. No retries permitted until 2026-03-10 09:21:50.167559746 +0000 UTC m=+1096.422457636 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.178833 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerStarted","Data":"ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5"}
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.179897 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.256995 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j9kwf" event={"ID":"6195b8a8-c8aa-4d92-b58b-066a2df99bd3","Type":"ContainerDied","Data":"136224520a357418a498376a1cbdd0153ccb3d2fdb86788ac7b44dece177b573"}
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.257036 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="136224520a357418a498376a1cbdd0153ccb3d2fdb86788ac7b44dece177b573"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.257113 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j9kwf"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.270239 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n4vhh" event={"ID":"cbe93226-96c7-4854-abdc-4afe54ad7ad5","Type":"ContainerStarted","Data":"11c349d8ba3b1aaf24065124e390760d2cc6670f859985a574e75a6f9d822d8c"}
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.272122 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d500-account-create-update-fpfdr" event={"ID":"486b3226-21be-4783-8b29-abaf747a7693","Type":"ContainerDied","Data":"e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9"}
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.272177 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.272280 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d500-account-create-update-fpfdr"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.282786 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c7c6-account-create-update-bzdlt" event={"ID":"58599ed2-6176-4003-8bdc-2a1d805da51f","Type":"ContainerDied","Data":"fb11cd34a85d26912302a2922a691d05843fde0fdc020b7213b5e6c9c65ef2fe"}
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.282833 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb11cd34a85d26912302a2922a691d05843fde0fdc020b7213b5e6c9c65ef2fe"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.282912 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c7c6-account-create-update-bzdlt"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.437153 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" podStartSLOduration=3.437103804 podStartE2EDuration="3.437103804s" podCreationTimestamp="2026-03-10 09:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:21:48.213763858 +0000 UTC m=+1094.468661747" watchObservedRunningTime="2026-03-10 09:21:48.437103804 +0000 UTC m=+1094.692001692"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443184 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xtrqg"]
Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.443595 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486b3226-21be-4783-8b29-abaf747a7693" containerName="mariadb-account-create-update"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443614 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="486b3226-21be-4783-8b29-abaf747a7693" containerName="mariadb-account-create-update"
Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.443637 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58599ed2-6176-4003-8bdc-2a1d805da51f" containerName="mariadb-account-create-update"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443644 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="58599ed2-6176-4003-8bdc-2a1d805da51f" containerName="mariadb-account-create-update"
Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.443659 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" containerName="mariadb-database-create"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443666 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" containerName="mariadb-database-create"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443828 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" containerName="mariadb-database-create"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443848 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="58599ed2-6176-4003-8bdc-2a1d805da51f" containerName="mariadb-account-create-update"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443864 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="486b3226-21be-4783-8b29-abaf747a7693" containerName="mariadb-account-create-update"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.447078 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.449891 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.450701 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g4r2q"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.455584 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xtrqg"]
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.577883 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.577957 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.578010 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.578146 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.680244 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.680355 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.680383 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.680437 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.685575 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.687059 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.688865 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.696233 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\")
" pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.773701 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.272449 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xtrqg"] Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.717256 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.720307 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.722390 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.725593 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.909525 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.909583 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.012208 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.012297 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.013307 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.041069 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: E0310 09:21:50.217972 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 09:21:50 crc kubenswrapper[4883]: E0310 09:21:50.218022 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 09:21:50 crc kubenswrapper[4883]: E0310 09:21:50.218110 4883 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" failed. No retries permitted until 2026-03-10 09:21:54.218080084 +0000 UTC m=+1100.472977974 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.222793 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.338119 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:51 crc kubenswrapper[4883]: W0310 09:21:51.153651 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5485539_c722_477d_b595_649e07eac50e.slice/crio-2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7 WatchSource:0}: Error finding container 2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7: Status 404 returned error can't find the container with id 2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7 Mar 10 09:21:51 crc kubenswrapper[4883]: I0310 09:21:51.335944 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xtrqg" event={"ID":"d5485539-c722-477d-b595-649e07eac50e","Type":"ContainerStarted","Data":"2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7"} Mar 10 09:21:51 crc kubenswrapper[4883]: I0310 09:21:51.568012 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:51 crc kubenswrapper[4883]: W0310 09:21:51.574179 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f11ff8e_7bba_408d_9d5f_6a3f3d16c280.slice/crio-7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb WatchSource:0}: Error finding container 7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb: Status 404 returned error can't find the container with id 7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.345534 4883 generic.go:334] "Generic (PLEG): container finished" podID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" containerID="86f2b3c9600146e785777999cdc5d4ea906b5ad635853fcbc695d4a1b48ea493" exitCode=0 Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.345652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/root-account-create-update-zbx4l" event={"ID":"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280","Type":"ContainerDied","Data":"86f2b3c9600146e785777999cdc5d4ea906b5ad635853fcbc695d4a1b48ea493"} Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.347928 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zbx4l" event={"ID":"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280","Type":"ContainerStarted","Data":"7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb"} Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.347957 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n4vhh" event={"ID":"cbe93226-96c7-4854-abdc-4afe54ad7ad5","Type":"ContainerStarted","Data":"98e65d47610f9e922ce3470f42049e9e4521b7087bcbd4c73749d28c484a5cf9"} Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.379084 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-n4vhh" podStartSLOduration=2.26811956 podStartE2EDuration="6.379066627s" podCreationTimestamp="2026-03-10 09:21:46 +0000 UTC" firstStartedPulling="2026-03-10 09:21:47.072902964 +0000 UTC m=+1093.327800853" lastFinishedPulling="2026-03-10 09:21:51.183850031 +0000 UTC m=+1097.438747920" observedRunningTime="2026-03-10 09:21:52.376540263 +0000 UTC m=+1098.631438152" watchObservedRunningTime="2026-03-10 09:21:52.379066627 +0000 UTC m=+1098.633964516" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.670503 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.786017 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") pod \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.786155 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") pod \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.786690 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" (UID: "5f11ff8e-7bba-408d-9d5f-6a3f3d16c280"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.787085 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.793542 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8" (OuterVolumeSpecName: "kube-api-access-8j8q8") pod "5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" (UID: "5f11ff8e-7bba-408d-9d5f-6a3f3d16c280"). InnerVolumeSpecName "kube-api-access-8j8q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.888804 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:54 crc kubenswrapper[4883]: I0310 09:21:54.297347 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:54 crc kubenswrapper[4883]: E0310 09:21:54.297597 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 09:21:54 crc kubenswrapper[4883]: E0310 09:21:54.297616 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 09:21:54 crc kubenswrapper[4883]: E0310 09:21:54.297667 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" failed. No retries permitted until 2026-03-10 09:22:02.297652693 +0000 UTC m=+1108.552550581 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found Mar 10 09:21:54 crc kubenswrapper[4883]: I0310 09:21:54.372911 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zbx4l" event={"ID":"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280","Type":"ContainerDied","Data":"7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb"} Mar 10 09:21:54 crc kubenswrapper[4883]: I0310 09:21:54.372953 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb" Mar 10 09:21:54 crc kubenswrapper[4883]: I0310 09:21:54.373011 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:55 crc kubenswrapper[4883]: I0310 09:21:55.592732 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:55 crc kubenswrapper[4883]: I0310 09:21:55.638318 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"] Mar 10 09:21:55 crc kubenswrapper[4883]: I0310 09:21:55.638624 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="dnsmasq-dns" containerID="cri-o://4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" gracePeriod=10 Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.114955 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.136750 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") pod \"6b114327-1a63-488a-aace-0488259b1278\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.136823 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") pod \"6b114327-1a63-488a-aace-0488259b1278\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.136981 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") pod \"6b114327-1a63-488a-aace-0488259b1278\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.155149 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8" (OuterVolumeSpecName: "kube-api-access-cksz8") pod "6b114327-1a63-488a-aace-0488259b1278" (UID: "6b114327-1a63-488a-aace-0488259b1278"). InnerVolumeSpecName "kube-api-access-cksz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.185665 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config" (OuterVolumeSpecName: "config") pod "6b114327-1a63-488a-aace-0488259b1278" (UID: "6b114327-1a63-488a-aace-0488259b1278"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.190809 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b114327-1a63-488a-aace-0488259b1278" (UID: "6b114327-1a63-488a-aace-0488259b1278"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.240799 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.240833 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.240843 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.244969 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.268115 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405132 4883 generic.go:334] "Generic (PLEG): container finished" podID="6b114327-1a63-488a-aace-0488259b1278" containerID="4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" exitCode=0 Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405212 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerDied","Data":"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc"} Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405267 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerDied","Data":"0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c"} Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405266 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405293 4883 scope.go:117] "RemoveContainer" containerID="4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.438509 4883 scope.go:117] "RemoveContainer" containerID="0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.448596 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"] Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.461711 4883 scope.go:117] "RemoveContainer" containerID="4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.462081 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"] Mar 10 09:21:56 crc kubenswrapper[4883]: E0310 09:21:56.462432 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc\": container with ID starting with 4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc not found: ID does not exist" 
containerID="4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.462497 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc"} err="failed to get container status \"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc\": rpc error: code = NotFound desc = could not find container \"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc\": container with ID starting with 4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc not found: ID does not exist" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.462533 4883 scope.go:117] "RemoveContainer" containerID="0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a" Mar 10 09:21:56 crc kubenswrapper[4883]: E0310 09:21:56.463033 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a\": container with ID starting with 0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a not found: ID does not exist" containerID="0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.463081 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a"} err="failed to get container status \"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a\": rpc error: code = NotFound desc = could not find container \"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a\": container with ID starting with 0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a not found: ID does not exist" Mar 10 09:21:57 crc kubenswrapper[4883]: I0310 09:21:57.413465 4883 generic.go:334] 
"Generic (PLEG): container finished" podID="cbe93226-96c7-4854-abdc-4afe54ad7ad5" containerID="98e65d47610f9e922ce3470f42049e9e4521b7087bcbd4c73749d28c484a5cf9" exitCode=0 Mar 10 09:21:57 crc kubenswrapper[4883]: I0310 09:21:57.413538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n4vhh" event={"ID":"cbe93226-96c7-4854-abdc-4afe54ad7ad5","Type":"ContainerDied","Data":"98e65d47610f9e922ce3470f42049e9e4521b7087bcbd4c73749d28c484a5cf9"} Mar 10 09:21:58 crc kubenswrapper[4883]: I0310 09:21:58.092251 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" path="/var/lib/kubelet/pods/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280/volumes" Mar 10 09:21:58 crc kubenswrapper[4883]: I0310 09:21:58.093924 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b114327-1a63-488a-aace-0488259b1278" path="/var/lib/kubelet/pods/6b114327-1a63-488a-aace-0488259b1278/volumes" Mar 10 09:21:58 crc kubenswrapper[4883]: I0310 09:21:58.422423 4883 generic.go:334] "Generic (PLEG): container finished" podID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerID="8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5" exitCode=0 Mar 10 09:21:58 crc kubenswrapper[4883]: I0310 09:21:58.422600 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerDied","Data":"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5"} Mar 10 09:21:59 crc kubenswrapper[4883]: I0310 09:21:59.433084 4883 generic.go:334] "Generic (PLEG): container finished" podID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerID="cd676384987aa2244a851651d9f3c8c0df5750259fc99e409d5e9d090a18495f" exitCode=0 Mar 10 09:21:59 crc kubenswrapper[4883]: I0310 09:21:59.433177 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerDied","Data":"cd676384987aa2244a851651d9f3c8c0df5750259fc99e409d5e9d090a18495f"} Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.146492 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"] Mar 10 09:22:00 crc kubenswrapper[4883]: E0310 09:22:00.146920 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="dnsmasq-dns" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.146941 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="dnsmasq-dns" Mar 10 09:22:00 crc kubenswrapper[4883]: E0310 09:22:00.146970 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" containerName="mariadb-account-create-update" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.146977 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" containerName="mariadb-account-create-update" Mar 10 09:22:00 crc kubenswrapper[4883]: E0310 09:22:00.146987 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="init" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.146993 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="init" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.147149 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="dnsmasq-dns" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.147169 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" containerName="mariadb-account-create-update" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.147804 4883 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.149683 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.149716 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.149879 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.157024 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"] Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.316331 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") pod \"auto-csr-approver-29552242-kz9jr\" (UID: \"12cada45-6ba5-4db1-9a13-3de652b390bb\") " pod="openshift-infra/auto-csr-approver-29552242-kz9jr" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.417914 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") pod \"auto-csr-approver-29552242-kz9jr\" (UID: \"12cada45-6ba5-4db1-9a13-3de652b390bb\") " pod="openshift-infra/auto-csr-approver-29552242-kz9jr" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.446773 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") pod \"auto-csr-approver-29552242-kz9jr\" (UID: 
\"12cada45-6ba5-4db1-9a13-3de652b390bb\") " pod="openshift-infra/auto-csr-approver-29552242-kz9jr" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.463861 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.256675 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cdc4t"] Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.257900 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cdc4t" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.260580 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.263888 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cdc4t"] Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.451823 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.451991 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.465810 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-northd-0" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.553879 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.554038 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.556527 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.570012 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t" Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.585701 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cdc4t" Mar 10 09:22:02 crc kubenswrapper[4883]: I0310 09:22:02.370324 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:22:02 crc kubenswrapper[4883]: I0310 09:22:02.377621 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:22:02 crc kubenswrapper[4883]: I0310 09:22:02.653979 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.164190 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.293918 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.293991 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294018 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294059 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294147 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294385 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.295219 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.295642 4883 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.296405 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.301017 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g" (OuterVolumeSpecName: "kube-api-access-9s66g") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "kube-api-access-9s66g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.304585 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.312802 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts" (OuterVolumeSpecName: "scripts") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.315200 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.317263 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398326 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398363 4883 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398379 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398389 4883 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398400 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398411 4883 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.486175 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerStarted","Data":"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5"} Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.486826 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.488257 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n4vhh" event={"ID":"cbe93226-96c7-4854-abdc-4afe54ad7ad5","Type":"ContainerDied","Data":"11c349d8ba3b1aaf24065124e390760d2cc6670f859985a574e75a6f9d822d8c"} Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.488317 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11c349d8ba3b1aaf24065124e390760d2cc6670f859985a574e75a6f9d822d8c" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.488333 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.493133 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerStarted","Data":"6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e"} Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.493394 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.528210 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.965400521 podStartE2EDuration="55.528194601s" podCreationTimestamp="2026-03-10 09:21:08 +0000 UTC" firstStartedPulling="2026-03-10 09:21:10.207186917 +0000 UTC m=+1056.462084806" lastFinishedPulling="2026-03-10 09:21:25.769980996 +0000 UTC m=+1072.024878886" observedRunningTime="2026-03-10 09:22:03.513306147 +0000 UTC m=+1109.768204036" watchObservedRunningTime="2026-03-10 09:22:03.528194601 +0000 UTC m=+1109.783092490" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.549876 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.057129204 podStartE2EDuration="55.549856418s" podCreationTimestamp="2026-03-10 09:21:08 +0000 UTC" firstStartedPulling="2026-03-10 09:21:10.409462633 +0000 UTC m=+1056.664360522" lastFinishedPulling="2026-03-10 09:21:25.902189847 +0000 UTC m=+1072.157087736" observedRunningTime="2026-03-10 09:22:03.537217365 +0000 UTC m=+1109.792115255" watchObservedRunningTime="2026-03-10 09:22:03.549856418 +0000 UTC m=+1109.804754308" Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.560997 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cdc4t"] Mar 10 09:22:03 crc kubenswrapper[4883]: W0310 
09:22:03.578964 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d523ed0_183e_4bec_a110_fe622b69ef79.slice/crio-35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce WatchSource:0}: Error finding container 35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce: Status 404 returned error can't find the container with id 35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.588518 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 09:22:03 crc kubenswrapper[4883]: W0310 09:22:03.588643 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39fdf41f_a914_4d0f_8d0c_5e378567a2db.slice/crio-0e48c8f74b7202798c3d97547fdee7356db2adb79887d4eb2a54a8474d824011 WatchSource:0}: Error finding container 0e48c8f74b7202798c3d97547fdee7356db2adb79887d4eb2a54a8474d824011: Status 404 returned error can't find the container with id 0e48c8f74b7202798c3d97547fdee7356db2adb79887d4eb2a54a8474d824011 Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.973717 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"] Mar 10 09:22:03 crc kubenswrapper[4883]: W0310 09:22:03.977447 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12cada45_6ba5_4db1_9a13_3de652b390bb.slice/crio-0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a WatchSource:0}: Error finding container 0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a: Status 404 returned error can't find the container with id 0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.507206 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"0e48c8f74b7202798c3d97547fdee7356db2adb79887d4eb2a54a8474d824011"} Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.511983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xtrqg" event={"ID":"d5485539-c722-477d-b595-649e07eac50e","Type":"ContainerStarted","Data":"a4591a3c1d7b1279937a98c799c83abe99429a53d01680d0592b50d1d359cd42"} Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.516801 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" event={"ID":"12cada45-6ba5-4db1-9a13-3de652b390bb","Type":"ContainerStarted","Data":"0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a"} Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.519886 4883 generic.go:334] "Generic (PLEG): container finished" podID="5d523ed0-183e-4bec-a110-fe622b69ef79" containerID="affbbf9bc93bb1cdc534fd16ed32d4696b867f4c70c0f6fa49bc5b18c4e55f72" exitCode=0 Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.519953 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cdc4t" event={"ID":"5d523ed0-183e-4bec-a110-fe622b69ef79","Type":"ContainerDied","Data":"affbbf9bc93bb1cdc534fd16ed32d4696b867f4c70c0f6fa49bc5b18c4e55f72"} Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.520033 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cdc4t" event={"ID":"5d523ed0-183e-4bec-a110-fe622b69ef79","Type":"ContainerStarted","Data":"35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce"} Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.535735 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xtrqg" podStartSLOduration=4.637286189 podStartE2EDuration="16.535724759s" 
podCreationTimestamp="2026-03-10 09:21:48 +0000 UTC" firstStartedPulling="2026-03-10 09:21:51.167767727 +0000 UTC m=+1097.422665615" lastFinishedPulling="2026-03-10 09:22:03.066206297 +0000 UTC m=+1109.321104185" observedRunningTime="2026-03-10 09:22:04.530584075 +0000 UTC m=+1110.785481964" watchObservedRunningTime="2026-03-10 09:22:04.535724759 +0000 UTC m=+1110.790622648" Mar 10 09:22:05 crc kubenswrapper[4883]: I0310 09:22:05.542896 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" event={"ID":"12cada45-6ba5-4db1-9a13-3de652b390bb","Type":"ContainerStarted","Data":"ed2fdfaa94f84ea6d60f3a225de89340f67d80aec7fa6c18b378eb49e482b9b8"} Mar 10 09:22:05 crc kubenswrapper[4883]: I0310 09:22:05.564608 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" podStartSLOduration=4.555954675 podStartE2EDuration="5.564586019s" podCreationTimestamp="2026-03-10 09:22:00 +0000 UTC" firstStartedPulling="2026-03-10 09:22:03.980417242 +0000 UTC m=+1110.235315141" lastFinishedPulling="2026-03-10 09:22:04.989048596 +0000 UTC m=+1111.243946485" observedRunningTime="2026-03-10 09:22:05.560999166 +0000 UTC m=+1111.815897055" watchObservedRunningTime="2026-03-10 09:22:05.564586019 +0000 UTC m=+1111.819483908" Mar 10 09:22:06 crc kubenswrapper[4883]: I0310 09:22:06.547771 4883 generic.go:334] "Generic (PLEG): container finished" podID="12cada45-6ba5-4db1-9a13-3de652b390bb" containerID="ed2fdfaa94f84ea6d60f3a225de89340f67d80aec7fa6c18b378eb49e482b9b8" exitCode=0 Mar 10 09:22:06 crc kubenswrapper[4883]: I0310 09:22:06.547834 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" event={"ID":"12cada45-6ba5-4db1-9a13-3de652b390bb","Type":"ContainerDied","Data":"ed2fdfaa94f84ea6d60f3a225de89340f67d80aec7fa6c18b378eb49e482b9b8"} Mar 10 09:22:08 crc kubenswrapper[4883]: I0310 09:22:08.858351 4883 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lb2z9" podUID="6691939e-adb0-420c-bf9e-f4a9b670c83b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 09:22:08 crc kubenswrapper[4883]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 09:22:08 crc kubenswrapper[4883]: > Mar 10 09:22:08 crc kubenswrapper[4883]: I0310 09:22:08.858391 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:22:08 crc kubenswrapper[4883]: I0310 09:22:08.862379 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.074372 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"] Mar 10 09:22:09 crc kubenswrapper[4883]: E0310 09:22:09.075349 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe93226-96c7-4854-abdc-4afe54ad7ad5" containerName="swift-ring-rebalance" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.075380 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe93226-96c7-4854-abdc-4afe54ad7ad5" containerName="swift-ring-rebalance" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.075870 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe93226-96c7-4854-abdc-4afe54ad7ad5" containerName="swift-ring-rebalance" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.076778 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.083208 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.104526 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"] Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206051 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206126 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206490 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206710 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: 
\"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206767 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206998 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309643 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309754 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309810 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") pod 
\"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309838 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309901 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309965 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.310202 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.310221 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: 
\"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.310220 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.310799 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.312109 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.331632 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.392250 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.824514 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.828679 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-kz9jr"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.922875 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") pod \"12cada45-6ba5-4db1-9a13-3de652b390bb\" (UID: \"12cada45-6ba5-4db1-9a13-3de652b390bb\") "
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.923253 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") pod \"5d523ed0-183e-4bec-a110-fe622b69ef79\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") "
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.923280 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") pod \"5d523ed0-183e-4bec-a110-fe622b69ef79\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") "
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.926974 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d523ed0-183e-4bec-a110-fe622b69ef79" (UID: "5d523ed0-183e-4bec-a110-fe622b69ef79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.927075 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m" (OuterVolumeSpecName: "kube-api-access-jbm4m") pod "5d523ed0-183e-4bec-a110-fe622b69ef79" (UID: "5d523ed0-183e-4bec-a110-fe622b69ef79"). InnerVolumeSpecName "kube-api-access-jbm4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.933619 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225" (OuterVolumeSpecName: "kube-api-access-hv225") pod "12cada45-6ba5-4db1-9a13-3de652b390bb" (UID: "12cada45-6ba5-4db1-9a13-3de652b390bb"). InnerVolumeSpecName "kube-api-access-hv225". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.025508 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.025538 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.025552 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.224503 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"]
Mar 10 09:22:10 crc kubenswrapper[4883]: W0310 09:22:10.239996 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0ae19cc_a5a6_40c2_81c1_c85d80abf698.slice/crio-97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e WatchSource:0}: Error finding container 97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e: Status 404 returned error can't find the container with id 97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.580652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9-config-xlg86" event={"ID":"e0ae19cc-a5a6-40c2-81c1-c85d80abf698","Type":"ContainerStarted","Data":"969226cfd74c44f03958486fd0daa3fd76ae699fc5b9689135438015e7c4fbc5"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.581162 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9-config-xlg86" event={"ID":"e0ae19cc-a5a6-40c2-81c1-c85d80abf698","Type":"ContainerStarted","Data":"97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.583590 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.583567 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cdc4t" event={"ID":"5d523ed0-183e-4bec-a110-fe622b69ef79","Type":"ContainerDied","Data":"35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.583716 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce"
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.591751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"cde66402e8a113a08f3939caaddb582cd76af576918a9313978e4264e146e64c"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.591783 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"5e5101b2c9fe4b924daf78660657a6f7ff64d89fc1dc8ea69534d49f205a3790"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.591794 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"4301134820ef18f16447d81d9bcd70b68e4721bd974daacf289ff2cae961b4c2"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.591804 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"60f26fdfcd1fd4afc796a7cd4e553b7137464997d77a8e0c525b10010919d749"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.593723 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" event={"ID":"12cada45-6ba5-4db1-9a13-3de652b390bb","Type":"ContainerDied","Data":"0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.593747 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a"
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.593935 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-kz9jr"
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.599827 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lb2z9-config-xlg86" podStartSLOduration=1.599809065 podStartE2EDuration="1.599809065s" podCreationTimestamp="2026-03-10 09:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:10.597072596 +0000 UTC m=+1116.851970484" watchObservedRunningTime="2026-03-10 09:22:10.599809065 +0000 UTC m=+1116.854706955"
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.905021 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"]
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.912624 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"]
Mar 10 09:22:11 crc kubenswrapper[4883]: I0310 09:22:11.605032 4883 generic.go:334] "Generic (PLEG): container finished" podID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" containerID="969226cfd74c44f03958486fd0daa3fd76ae699fc5b9689135438015e7c4fbc5" exitCode=0
Mar 10 09:22:11 crc kubenswrapper[4883]: I0310 09:22:11.605162 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9-config-xlg86" event={"ID":"e0ae19cc-a5a6-40c2-81c1-c85d80abf698","Type":"ContainerDied","Data":"969226cfd74c44f03958486fd0daa3fd76ae699fc5b9689135438015e7c4fbc5"}
Mar 10 09:22:11 crc kubenswrapper[4883]: I0310 09:22:11.607955 4883 generic.go:334] "Generic (PLEG): container finished" podID="d5485539-c722-477d-b595-649e07eac50e" containerID="a4591a3c1d7b1279937a98c799c83abe99429a53d01680d0592b50d1d359cd42" exitCode=0
Mar 10 09:22:11 crc kubenswrapper[4883]: I0310 09:22:11.607993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xtrqg" event={"ID":"d5485539-c722-477d-b595-649e07eac50e","Type":"ContainerDied","Data":"a4591a3c1d7b1279937a98c799c83abe99429a53d01680d0592b50d1d359cd42"}
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.107971 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2a502d2-d219-4f01-aebc-f27fb7766458" path="/var/lib/kubelet/pods/a2a502d2-d219-4f01-aebc-f27fb7766458/volumes"
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.620288 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"20a3c9c9b07e0b3bbf86ab05587abec2ae8fb8c885231972c34fd3870d545603"}
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.620673 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"f840485d0b5166b267e837f8a420a390e5e2179416cc947e579e7af464d5d9fd"}
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.620693 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"6d07d8124acd370018c4e4fc31f9f249ffaab52b09d4c2e03bac9d7bae73169c"}
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.620703 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"5ca3f23453da824e8b3a92885593c23039499f92e18020946ff11d3ad0255a92"}
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.887442 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.922190 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989132 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") "
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989258 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") "
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989253 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run" (OuterVolumeSpecName: "var-run") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989294 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") "
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989371 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") "
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989410 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") "
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989484 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989503 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") "
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990009 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990214 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts" (OuterVolumeSpecName: "scripts") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990305 4883 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990331 4883 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990344 4883 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990351 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990454 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.993837 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7" (OuterVolumeSpecName: "kube-api-access-c4jk7") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "kube-api-access-c4jk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.090993 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") pod \"d5485539-c722-477d-b595-649e07eac50e\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") "
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091202 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") pod \"d5485539-c722-477d-b595-649e07eac50e\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") "
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091297 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") pod \"d5485539-c722-477d-b595-649e07eac50e\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") "
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091335 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") pod \"d5485539-c722-477d-b595-649e07eac50e\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") "
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091815 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091839 4883 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.094652 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d5485539-c722-477d-b595-649e07eac50e" (UID: "d5485539-c722-477d-b595-649e07eac50e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.096415 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd" (OuterVolumeSpecName: "kube-api-access-s9qnd") pod "d5485539-c722-477d-b595-649e07eac50e" (UID: "d5485539-c722-477d-b595-649e07eac50e"). InnerVolumeSpecName "kube-api-access-s9qnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.111372 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5485539-c722-477d-b595-649e07eac50e" (UID: "d5485539-c722-477d-b595-649e07eac50e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.123613 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data" (OuterVolumeSpecName: "config-data") pod "d5485539-c722-477d-b595-649e07eac50e" (UID: "d5485539-c722-477d-b595-649e07eac50e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.194375 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.194781 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.194799 4883 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.194813 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.327017 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"]
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.340703 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"]
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.632344 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e"
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.632449 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.635051 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xtrqg" event={"ID":"d5485539-c722-477d-b595-649e07eac50e","Type":"ContainerDied","Data":"2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7"}
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.635125 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7"
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.635151 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xtrqg"
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.921988 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lb2z9"
Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.999232 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"]
Mar 10 09:22:14 crc kubenswrapper[4883]: E0310 09:22:14.009917 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cada45-6ba5-4db1-9a13-3de652b390bb" containerName="oc"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.010458 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cada45-6ba5-4db1-9a13-3de652b390bb" containerName="oc"
Mar 10 09:22:14 crc kubenswrapper[4883]: E0310 09:22:14.010584 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5485539-c722-477d-b595-649e07eac50e" containerName="glance-db-sync"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.010646 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5485539-c722-477d-b595-649e07eac50e" containerName="glance-db-sync"
Mar 10 09:22:14 crc kubenswrapper[4883]: E0310 09:22:14.010718 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" containerName="ovn-config"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.010766 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" containerName="ovn-config"
Mar 10 09:22:14 crc kubenswrapper[4883]: E0310 09:22:14.010821 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d523ed0-183e-4bec-a110-fe622b69ef79" containerName="mariadb-account-create-update"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.010875 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d523ed0-183e-4bec-a110-fe622b69ef79" containerName="mariadb-account-create-update"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.011137 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5485539-c722-477d-b595-649e07eac50e" containerName="glance-db-sync"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.011196 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" containerName="ovn-config"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.011250 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d523ed0-183e-4bec-a110-fe622b69ef79" containerName="mariadb-account-create-update"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.011325 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cada45-6ba5-4db1-9a13-3de652b390bb" containerName="oc"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.012301 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.019137 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"]
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.102778 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" path="/var/lib/kubelet/pods/e0ae19cc-a5a6-40c2-81c1-c85d80abf698/volumes"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111200 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111412 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111517 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111664 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111791 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214659 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214843 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214891 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214944 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214980 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.215690 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.215765 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.216160 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.216919 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 crc kubenswrapper[4883]: I0310 09:22:14.233347 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.328938 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.659330 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"f7334b9c5072a37af68e44df13c967bdc36c3bc3e3ad13a8af7e3548b95adf0e"}
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.659709 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"f163ae110b1cda0c4cf72333733a9a16fbb6fc3fcf03143dfd294ff6b6f85791"}
Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.757084 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"]
Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673009 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"dddd2526d3626ca272ce3dde931298a46bd49bc7885f7181adfdfff8ef818b33"}
Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673392 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"0c0337430709f67cbf74510fb4fc66182c5a0926ade86b0871b6a9763923b8ef"}
Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673406 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"b4337a1a9bdcb12179e815ec153ed87760dff0dd587c319ff7e107ff0c109245"}
Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673415 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"2254f6d1873b619efb991750d1877e62722be76753968c3e9c37772f04278339"}
Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673424 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"ea5fd364998bd0d772f65137af11f496dc0e207c51e0d266e7a6fbb60f23707c"}
Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.684319 4883 generic.go:334] "Generic (PLEG): container finished" podID="949789c9-e015-4172-90f8-9a97607f3cd0" containerID="bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48" exitCode=0
Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.684371 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerDied","Data":"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48"}
Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.684400 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerStarted","Data":"d6379cc5f0f0d1e68546e9664112db8e5584a977dbf3b98f9500eb8f5898a6cd"}
Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.714607 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.000563198 podStartE2EDuration="30.714589051s" podCreationTimestamp="2026-03-10 09:21:45 +0000 UTC" firstStartedPulling="2026-03-10 09:22:03.591952289 +0000 UTC m=+1109.846850178" lastFinishedPulling="2026-03-10 09:22:14.305978142 +0000 UTC m=+1120.560876031" observedRunningTime="2026-03-10 09:22:15.706317805 +0000 UTC m=+1121.961215694" watchObservedRunningTime="2026-03-10 09:22:15.714589051 +0000 UTC m=+1121.969486941"
Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.045293 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"]
Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.069169 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"]
Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.070802 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt"
Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.072447 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.090435 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"]
Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.155448 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt"
Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.155771 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt"
Mar 10
09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.155862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.156161 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.156225 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.156255 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258336 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " 
pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258395 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258421 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258453 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258551 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 
crc kubenswrapper[4883]: I0310 09:22:16.259640 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.259892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.260004 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.260129 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.260375 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.276892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.385368 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.693163 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerStarted","Data":"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225"} Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.693523 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.709863 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" podStartSLOduration=3.70984843 podStartE2EDuration="3.70984843s" podCreationTimestamp="2026-03-10 09:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:16.706515404 +0000 UTC m=+1122.961413294" watchObservedRunningTime="2026-03-10 09:22:16.70984843 +0000 UTC m=+1122.964746319" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.799037 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 09:22:16 crc kubenswrapper[4883]: W0310 09:22:16.801221 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod631a0fc2_de6d_4778_bce2_46b69c306e44.slice/crio-5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c WatchSource:0}: 
Error finding container 5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c: Status 404 returned error can't find the container with id 5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c Mar 10 09:22:17 crc kubenswrapper[4883]: I0310 09:22:17.704180 4883 generic.go:334] "Generic (PLEG): container finished" podID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerID="e2b7245fe34a086ba653b170ba754d14297a7a81b375fe409da9fc2787c69d3f" exitCode=0 Mar 10 09:22:17 crc kubenswrapper[4883]: I0310 09:22:17.704233 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerDied","Data":"e2b7245fe34a086ba653b170ba754d14297a7a81b375fe409da9fc2787c69d3f"} Mar 10 09:22:17 crc kubenswrapper[4883]: I0310 09:22:17.704665 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerStarted","Data":"5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c"} Mar 10 09:22:17 crc kubenswrapper[4883]: I0310 09:22:17.704745 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="dnsmasq-dns" containerID="cri-o://09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" gracePeriod=10 Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.049649 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203295 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203360 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203512 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203661 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203747 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.208550 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b" (OuterVolumeSpecName: "kube-api-access-x5d6b") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "kube-api-access-x5d6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.231155 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.234329 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.235745 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.236400 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config" (OuterVolumeSpecName: "config") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307846 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307876 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307891 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307903 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307913 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715626 4883 generic.go:334] "Generic (PLEG): container finished" podID="949789c9-e015-4172-90f8-9a97607f3cd0" containerID="09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" exitCode=0 Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715701 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715729 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerDied","Data":"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225"} Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715795 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerDied","Data":"d6379cc5f0f0d1e68546e9664112db8e5584a977dbf3b98f9500eb8f5898a6cd"} Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715825 4883 scope.go:117] "RemoveContainer" containerID="09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.718070 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerStarted","Data":"751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2"} Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.718236 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.733604 4883 scope.go:117] "RemoveContainer" containerID="bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.756945 4883 scope.go:117] "RemoveContainer" containerID="09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" Mar 10 09:22:18 crc kubenswrapper[4883]: E0310 09:22:18.757360 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225\": container with ID 
starting with 09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225 not found: ID does not exist" containerID="09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.757400 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225"} err="failed to get container status \"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225\": rpc error: code = NotFound desc = could not find container \"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225\": container with ID starting with 09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225 not found: ID does not exist" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.757426 4883 scope.go:117] "RemoveContainer" containerID="bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48" Mar 10 09:22:18 crc kubenswrapper[4883]: E0310 09:22:18.758092 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48\": container with ID starting with bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48 not found: ID does not exist" containerID="bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.758131 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48"} err="failed to get container status \"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48\": rpc error: code = NotFound desc = could not find container \"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48\": container with ID starting with bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48 not found: 
ID does not exist" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.769945 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podStartSLOduration=2.769931593 podStartE2EDuration="2.769931593s" podCreationTimestamp="2026-03-10 09:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:18.766097163 +0000 UTC m=+1125.020995052" watchObservedRunningTime="2026-03-10 09:22:18.769931593 +0000 UTC m=+1125.024829482" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.784077 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"] Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.791414 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"] Mar 10 09:22:19 crc kubenswrapper[4883]: I0310 09:22:19.700712 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:22:19 crc kubenswrapper[4883]: I0310 09:22:19.975777 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 09:22:20 crc kubenswrapper[4883]: I0310 09:22:20.087738 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" path="/var/lib/kubelet/pods/949789c9-e015-4172-90f8-9a97607f3cd0/volumes" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.193745 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8664j"] Mar 10 09:22:21 crc kubenswrapper[4883]: E0310 09:22:21.194329 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="dnsmasq-dns" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.194343 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="dnsmasq-dns" Mar 10 09:22:21 crc kubenswrapper[4883]: E0310 09:22:21.194352 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="init" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.194358 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="init" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.194559 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="dnsmasq-dns" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.195106 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.211720 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8664j"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.300152 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wzxkv"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.301370 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.322565 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.323909 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.327752 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.337276 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wzxkv"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.341980 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.379824 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.380040 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.481535 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.481662 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7qwk\" 
(UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.481697 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.481773 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.482532 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.482733 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.483173 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.494115 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.495349 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.496962 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.514701 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hrq22"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.515879 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.522000 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.527208 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hrq22"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.529711 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.565246 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-w5q6s"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.566461 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571294 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571772 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571798 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571787 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92kmh" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571911 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w5q6s"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.584878 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.584932 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7qwk\" (UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.584963 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") pod \"barbican-db-create-wzxkv\" (UID: 
\"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.585145 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.585930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.586039 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.603181 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.604368 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.605861 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.610894 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.622387 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7qwk\" (UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.625117 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.635126 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688325 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688534 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688788 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688891 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688947 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") pod 
\"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.689066 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.689099 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.790876 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.790955 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.790990 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791072 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791806 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791827 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791873 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791900 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.792538 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.792794 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.796613 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.800656 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.807469 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.807966 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.811551 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.812432 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.824863 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.858141 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.893741 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.898689 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.898937 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.899635 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.924250 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.952286 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.030247 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.147593 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"] Mar 10 09:22:22 crc kubenswrapper[4883]: W0310 09:22:22.176458 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc357ec28_9cec_42e8_9e4d_dc1fb9960bc7.slice/crio-3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00 WatchSource:0}: Error finding container 3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00: Status 404 returned error can't find the container with id 3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.196064 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"] Mar 10 09:22:22 crc kubenswrapper[4883]: W0310 09:22:22.217244 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded176738_d518_45e3_be47_3ace090d0e7a.slice/crio-58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272 WatchSource:0}: Error finding container 58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272: Status 
404 returned error can't find the container with id 58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.539306 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8664j"] Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.604838 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hrq22"] Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.651996 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w5q6s"] Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.671048 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wzxkv"] Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.684242 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"] Mar 10 09:22:22 crc kubenswrapper[4883]: W0310 09:22:22.699766 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94df275b_e089_4e1f_8eac_e4806d2f1178.slice/crio-3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53 WatchSource:0}: Error finding container 3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53: Status 404 returned error can't find the container with id 3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.761105 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed176738-d518-45e3-be47-3ace090d0e7a" containerID="c7587acba5dab37b49dbdd81924e01184e73978fd599f62b1af6671e7ae50b6e" exitCode=0 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.761186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ef9a-account-create-update-4bwrd" 
event={"ID":"ed176738-d518-45e3-be47-3ace090d0e7a","Type":"ContainerDied","Data":"c7587acba5dab37b49dbdd81924e01184e73978fd599f62b1af6671e7ae50b6e"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.761220 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ef9a-account-create-update-4bwrd" event={"ID":"ed176738-d518-45e3-be47-3ace090d0e7a","Type":"ContainerStarted","Data":"58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.762862 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fca8-account-create-update-7jkwx" event={"ID":"07a8b78f-e864-49d5-9dfb-aebd86741885","Type":"ContainerStarted","Data":"4564e46ddd506f32a201d9f1057e104c34fe9ab13966b5aa8a79230af780d4c0"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.765293 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5q6s" event={"ID":"96942836-243a-48c5-be3d-5eb5e5f166d0","Type":"ContainerStarted","Data":"be6c6a8d52e4e43dcf7deb7dc6d88e3b0cf1816a00b3324c09c224772a4f6fa7"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.767194 4883 generic.go:334] "Generic (PLEG): container finished" podID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" containerID="a48c527e869a78aa5301ce2ab9632963d3e2d800250d247df83963b7da9be724" exitCode=0 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.767271 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27c8-account-create-update-9w2q5" event={"ID":"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7","Type":"ContainerDied","Data":"a48c527e869a78aa5301ce2ab9632963d3e2d800250d247df83963b7da9be724"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.767307 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27c8-account-create-update-9w2q5" 
event={"ID":"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7","Type":"ContainerStarted","Data":"3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.768611 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wzxkv" event={"ID":"94df275b-e089-4e1f-8eac-e4806d2f1178","Type":"ContainerStarted","Data":"3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.780417 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hrq22" event={"ID":"24713bd6-5868-43ec-94ec-2371a49a0b88","Type":"ContainerStarted","Data":"dbf762b1ecb64eb779af79450c42147ea6c3038641d43eb500690891663077b3"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.782031 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8664j" event={"ID":"ed7aa202-c734-4333-a1de-1bdb39d59804","Type":"ContainerStarted","Data":"9b5b8b914e9f13ae13ab5c11566677509e8d1e77ce6e654bf5a93d7fab30b16c"} Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.791813 4883 generic.go:334] "Generic (PLEG): container finished" podID="94df275b-e089-4e1f-8eac-e4806d2f1178" containerID="98460e32a504c0e3ede8a9fd544c2c34e4954a1cfed507bb532c53cf560762fd" exitCode=0 Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.791908 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wzxkv" event={"ID":"94df275b-e089-4e1f-8eac-e4806d2f1178","Type":"ContainerDied","Data":"98460e32a504c0e3ede8a9fd544c2c34e4954a1cfed507bb532c53cf560762fd"} Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.794420 4883 generic.go:334] "Generic (PLEG): container finished" podID="24713bd6-5868-43ec-94ec-2371a49a0b88" containerID="5dba08c9d93be005c0c85060006f6110a86c429508b6e36e94151d58e533d961" exitCode=0 Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.794499 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-db-create-hrq22" event={"ID":"24713bd6-5868-43ec-94ec-2371a49a0b88","Type":"ContainerDied","Data":"5dba08c9d93be005c0c85060006f6110a86c429508b6e36e94151d58e533d961"} Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.795998 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed7aa202-c734-4333-a1de-1bdb39d59804" containerID="8b81faa071a739cf8a7f25085f6d2124f3dcc3e17601b69b578e8e6f428069ce" exitCode=0 Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.796073 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8664j" event={"ID":"ed7aa202-c734-4333-a1de-1bdb39d59804","Type":"ContainerDied","Data":"8b81faa071a739cf8a7f25085f6d2124f3dcc3e17601b69b578e8e6f428069ce"} Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.797502 4883 generic.go:334] "Generic (PLEG): container finished" podID="07a8b78f-e864-49d5-9dfb-aebd86741885" containerID="9f05adebe53489f83df9e03cf5da9583790650f545f7218c9e2d571583c52501" exitCode=0 Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.797739 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fca8-account-create-update-7jkwx" event={"ID":"07a8b78f-e864-49d5-9dfb-aebd86741885","Type":"ContainerDied","Data":"9f05adebe53489f83df9e03cf5da9583790650f545f7218c9e2d571583c52501"} Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.177050 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.181943 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.350484 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") pod \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.350524 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7qwk\" (UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") pod \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.350554 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") pod \"ed176738-d518-45e3-be47-3ace090d0e7a\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.350581 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") pod \"ed176738-d518-45e3-be47-3ace090d0e7a\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.351763 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed176738-d518-45e3-be47-3ace090d0e7a" (UID: "ed176738-d518-45e3-be47-3ace090d0e7a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.356549 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" (UID: "c357ec28-9cec-42e8-9e4d-dc1fb9960bc7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.367654 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc" (OuterVolumeSpecName: "kube-api-access-lfmvc") pod "ed176738-d518-45e3-be47-3ace090d0e7a" (UID: "ed176738-d518-45e3-be47-3ace090d0e7a"). InnerVolumeSpecName "kube-api-access-lfmvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.376298 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk" (OuterVolumeSpecName: "kube-api-access-x7qwk") pod "c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" (UID: "c357ec28-9cec-42e8-9e4d-dc1fb9960bc7"). InnerVolumeSpecName "kube-api-access-x7qwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.454190 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.454224 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7qwk\" (UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.454237 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.454248 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.812760 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.812763 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ef9a-account-create-update-4bwrd" event={"ID":"ed176738-d518-45e3-be47-3ace090d0e7a","Type":"ContainerDied","Data":"58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272"} Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.813171 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.827133 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.827549 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27c8-account-create-update-9w2q5" event={"ID":"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7","Type":"ContainerDied","Data":"3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00"} Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.827603 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00" Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.387339 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.453875 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.457259 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="dnsmasq-dns" 
containerID="cri-o://ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5" gracePeriod=10 Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.850148 4883 generic.go:334] "Generic (PLEG): container finished" podID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerID="ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5" exitCode=0 Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.850200 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerDied","Data":"ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.282396 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.322000 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.344564 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.356259 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8664j" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410589 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") pod \"94df275b-e089-4e1f-8eac-e4806d2f1178\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410637 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") pod \"ed7aa202-c734-4333-a1de-1bdb39d59804\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410692 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") pod \"94df275b-e089-4e1f-8eac-e4806d2f1178\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410717 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") pod \"24713bd6-5868-43ec-94ec-2371a49a0b88\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410799 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") pod \"07a8b78f-e864-49d5-9dfb-aebd86741885\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410841 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") pod \"24713bd6-5868-43ec-94ec-2371a49a0b88\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410923 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") pod \"ed7aa202-c734-4333-a1de-1bdb39d59804\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410944 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") pod \"07a8b78f-e864-49d5-9dfb-aebd86741885\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.411911 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07a8b78f-e864-49d5-9dfb-aebd86741885" (UID: "07a8b78f-e864-49d5-9dfb-aebd86741885"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.412457 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed7aa202-c734-4333-a1de-1bdb39d59804" (UID: "ed7aa202-c734-4333-a1de-1bdb39d59804"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.412937 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24713bd6-5868-43ec-94ec-2371a49a0b88" (UID: "24713bd6-5868-43ec-94ec-2371a49a0b88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.413670 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94df275b-e089-4e1f-8eac-e4806d2f1178" (UID: "94df275b-e089-4e1f-8eac-e4806d2f1178"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.415911 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx" (OuterVolumeSpecName: "kube-api-access-lt7rx") pod "24713bd6-5868-43ec-94ec-2371a49a0b88" (UID: "24713bd6-5868-43ec-94ec-2371a49a0b88"). InnerVolumeSpecName "kube-api-access-lt7rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.415994 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm" (OuterVolumeSpecName: "kube-api-access-v6tlm") pod "07a8b78f-e864-49d5-9dfb-aebd86741885" (UID: "07a8b78f-e864-49d5-9dfb-aebd86741885"). InnerVolumeSpecName "kube-api-access-v6tlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.417679 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf" (OuterVolumeSpecName: "kube-api-access-kkbcf") pod "ed7aa202-c734-4333-a1de-1bdb39d59804" (UID: "ed7aa202-c734-4333-a1de-1bdb39d59804"). InnerVolumeSpecName "kube-api-access-kkbcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.417951 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm" (OuterVolumeSpecName: "kube-api-access-k94lm") pod "94df275b-e089-4e1f-8eac-e4806d2f1178" (UID: "94df275b-e089-4e1f-8eac-e4806d2f1178"). InnerVolumeSpecName "kube-api-access-k94lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.475138 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524510 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524551 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524562 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524684 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524702 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524713 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524724 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524734 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626012 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626103 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626279 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626392 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626436 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: 
\"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.630168 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz" (OuterVolumeSpecName: "kube-api-access-27qgz") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "kube-api-access-27qgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.660988 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.662247 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: E0310 09:22:27.666766 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config podName:7b8b44ba-0853-4287-a0ba-ef1607c66d7b nodeName:}" failed. No retries permitted until 2026-03-10 09:22:28.166731236 +0000 UTC m=+1134.421629135 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b") : error deleting /var/lib/kubelet/pods/7b8b44ba-0853-4287-a0ba-ef1607c66d7b/volume-subpaths: remove /var/lib/kubelet/pods/7b8b44ba-0853-4287-a0ba-ef1607c66d7b/volume-subpaths: no such file or directory Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.667047 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.729919 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.729950 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.729965 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.729975 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 
09:22:27.861896 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hrq22" event={"ID":"24713bd6-5868-43ec-94ec-2371a49a0b88","Type":"ContainerDied","Data":"dbf762b1ecb64eb779af79450c42147ea6c3038641d43eb500690891663077b3"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.862012 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf762b1ecb64eb779af79450c42147ea6c3038641d43eb500690891663077b3" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.862121 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.869773 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8664j" event={"ID":"ed7aa202-c734-4333-a1de-1bdb39d59804","Type":"ContainerDied","Data":"9b5b8b914e9f13ae13ab5c11566677509e8d1e77ce6e654bf5a93d7fab30b16c"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.869831 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b5b8b914e9f13ae13ab5c11566677509e8d1e77ce6e654bf5a93d7fab30b16c" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.869926 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8664j" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.874096 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fca8-account-create-update-7jkwx" event={"ID":"07a8b78f-e864-49d5-9dfb-aebd86741885","Type":"ContainerDied","Data":"4564e46ddd506f32a201d9f1057e104c34fe9ab13966b5aa8a79230af780d4c0"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.874147 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4564e46ddd506f32a201d9f1057e104c34fe9ab13966b5aa8a79230af780d4c0" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.874220 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.876978 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5q6s" event={"ID":"96942836-243a-48c5-be3d-5eb5e5f166d0","Type":"ContainerStarted","Data":"e71d6d0bfc1a16e1dac8ed8107010142581b341f08eea68db466ac02a741d979"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.880280 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wzxkv" event={"ID":"94df275b-e089-4e1f-8eac-e4806d2f1178","Type":"ContainerDied","Data":"3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.880308 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.880364 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.882225 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerDied","Data":"4fe8d37616588503394b5e0c543034c9d75405c62d38c6a3b4276c05a61d4d46"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.882268 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.882272 4883 scope.go:117] "RemoveContainer" containerID="ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.906222 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-w5q6s" podStartSLOduration=2.408408982 podStartE2EDuration="6.906211401s" podCreationTimestamp="2026-03-10 09:22:21 +0000 UTC" firstStartedPulling="2026-03-10 09:22:22.708995526 +0000 UTC m=+1128.963893415" lastFinishedPulling="2026-03-10 09:22:27.206797955 +0000 UTC m=+1133.461695834" observedRunningTime="2026-03-10 09:22:27.905101417 +0000 UTC m=+1134.159999306" watchObservedRunningTime="2026-03-10 09:22:27.906211401 +0000 UTC m=+1134.161109290" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.928638 4883 scope.go:117] "RemoveContainer" containerID="e241856c7e2d9bdc80ba8f22b6df1569df773a3d438f85a0d6ce70af2f1197e4" Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.239320 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.239757 4883 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config" (OuterVolumeSpecName: "config") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.341381 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.513829 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.519269 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:22:29 crc kubenswrapper[4883]: I0310 09:22:29.900835 4883 generic.go:334] "Generic (PLEG): container finished" podID="96942836-243a-48c5-be3d-5eb5e5f166d0" containerID="e71d6d0bfc1a16e1dac8ed8107010142581b341f08eea68db466ac02a741d979" exitCode=0 Mar 10 09:22:29 crc kubenswrapper[4883]: I0310 09:22:29.900946 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5q6s" event={"ID":"96942836-243a-48c5-be3d-5eb5e5f166d0","Type":"ContainerDied","Data":"e71d6d0bfc1a16e1dac8ed8107010142581b341f08eea68db466ac02a741d979"} Mar 10 09:22:30 crc kubenswrapper[4883]: I0310 09:22:30.091109 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" path="/var/lib/kubelet/pods/7b8b44ba-0853-4287-a0ba-ef1607c66d7b/volumes" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.199446 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.395846 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") pod \"96942836-243a-48c5-be3d-5eb5e5f166d0\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.396119 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") pod \"96942836-243a-48c5-be3d-5eb5e5f166d0\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.396239 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") pod \"96942836-243a-48c5-be3d-5eb5e5f166d0\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.402802 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl" (OuterVolumeSpecName: "kube-api-access-s5vkl") pod "96942836-243a-48c5-be3d-5eb5e5f166d0" (UID: "96942836-243a-48c5-be3d-5eb5e5f166d0"). InnerVolumeSpecName "kube-api-access-s5vkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.423267 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96942836-243a-48c5-be3d-5eb5e5f166d0" (UID: "96942836-243a-48c5-be3d-5eb5e5f166d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.440453 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data" (OuterVolumeSpecName: "config-data") pod "96942836-243a-48c5-be3d-5eb5e5f166d0" (UID: "96942836-243a-48c5-be3d-5eb5e5f166d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.498514 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.499445 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.499487 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.925544 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5q6s" event={"ID":"96942836-243a-48c5-be3d-5eb5e5f166d0","Type":"ContainerDied","Data":"be6c6a8d52e4e43dcf7deb7dc6d88e3b0cf1816a00b3324c09c224772a4f6fa7"} Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.925609 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be6c6a8d52e4e43dcf7deb7dc6d88e3b0cf1816a00b3324c09c224772a4f6fa7" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.925671 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157096 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157617 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a8b78f-e864-49d5-9dfb-aebd86741885" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157659 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a8b78f-e864-49d5-9dfb-aebd86741885" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157707 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24713bd6-5868-43ec-94ec-2371a49a0b88" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157715 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="24713bd6-5868-43ec-94ec-2371a49a0b88" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157728 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94df275b-e089-4e1f-8eac-e4806d2f1178" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157734 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="94df275b-e089-4e1f-8eac-e4806d2f1178" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157762 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157769 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157778 4883 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ed7aa202-c734-4333-a1de-1bdb39d59804" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157784 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7aa202-c734-4333-a1de-1bdb39d59804" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157805 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96942836-243a-48c5-be3d-5eb5e5f166d0" containerName="keystone-db-sync" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157812 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="96942836-243a-48c5-be3d-5eb5e5f166d0" containerName="keystone-db-sync" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157837 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed176738-d518-45e3-be47-3ace090d0e7a" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157845 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed176738-d518-45e3-be47-3ace090d0e7a" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157853 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="dnsmasq-dns" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157860 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="dnsmasq-dns" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157880 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="init" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157886 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="init" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158234 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ed176738-d518-45e3-be47-3ace090d0e7a" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158259 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="94df275b-e089-4e1f-8eac-e4806d2f1178" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158269 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="dnsmasq-dns" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158277 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a8b78f-e864-49d5-9dfb-aebd86741885" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158290 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158297 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="24713bd6-5868-43ec-94ec-2371a49a0b88" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158309 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7aa202-c734-4333-a1de-1bdb39d59804" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158317 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="96942836-243a-48c5-be3d-5eb5e5f166d0" containerName="keystone-db-sync" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.159338 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.172139 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.202350 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.203920 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210524 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210562 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210637 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92kmh" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210852 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210986 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216546 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216678 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216739 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216783 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216812 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216832 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216852 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkgl9\" (UniqueName: 
\"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216954 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.217083 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.217126 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.217152 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.217166 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.220392 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318284 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318323 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318345 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318362 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318404 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318447 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318489 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318707 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319622 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319242 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319554 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319696 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319302 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319793 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319931 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.320014 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkgl9\" (UniqueName: \"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.326344 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.329246 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.333468 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: 
\"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.334059 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.343089 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.350352 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.353950 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.356124 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.360237 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.360517 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.360541 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-n8r6x" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.363529 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.364627 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkgl9\" (UniqueName: \"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.376096 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.410018 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-x2hf5"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.415262 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.420851 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421167 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421300 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-prwrq" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421354 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421396 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421513 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421534 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421593 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421626 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lhjm\" (UniqueName: \"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421691 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421722 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrwr\" (UniqueName: 
\"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421763 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421880 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.443764 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x2hf5"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.455386 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kcmgr"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.456277 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.458359 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vpjch" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.458611 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.460356 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.474159 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kcmgr"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.475871 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523647 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523712 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523759 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " 
pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523810 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lhjm\" (UniqueName: \"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523872 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523902 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523919 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrwr\" (UniqueName: \"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523966 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 
09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523988 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524010 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524065 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524111 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524137 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524156 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.525315 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.530948 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.537397 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.537787 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.543785 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.544339 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.544694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.545598 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.546496 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.546609 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.550292 4883 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.551648 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.572876 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.573963 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.577170 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.577439 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.577651 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g4r2q" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.577865 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.581139 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.594130 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrwr\" (UniqueName: \"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.603015 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lhjm\" (UniqueName: 
\"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.623148 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.625827 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.625868 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.625971 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.625992 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.626022 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.626101 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.626123 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.626176 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.633872 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.635531 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.653326 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-jt8bs"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.654988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.657767 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.657975 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q2mjf" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.666930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.708653 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.731896 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732189 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732256 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732612 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732868 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732917 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733229 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733466 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733532 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733633 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 
09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733958 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.734190 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.734212 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.735375 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.749626 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.751543 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.752155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.773135 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.781722 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jt8bs"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.790099 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.794985 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-v9wqz"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.797109 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.803877 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.804140 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.810185 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4s72j" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.812823 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.836020 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841184 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841262 4883 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841321 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841374 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841400 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841413 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841461 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841505 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841553 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841587 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841612 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.842873 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.842972 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.843944 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.844505 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v9wqz"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.845952 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.851832 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.851967 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.854601 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.855683 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.855896 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.866256 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.867882 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.868925 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.870873 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.873567 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.877327 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.879137 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.879219 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.881102 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.887607 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.890784 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.904038 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.936057 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945395 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945446 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945492 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945512 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945537 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " 
pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945606 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945626 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945646 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " 
pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945667 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945681 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945722 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945740 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 
09:22:32.945758 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945776 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945793 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945817 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945839 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 
09:22:32.946422 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.946544 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.946584 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.947173 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.948109 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 
09:22:32.948176 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.948210 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.948235 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.948267 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.951526 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.953012 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.977076 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.009654 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050684 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050742 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050772 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc 
kubenswrapper[4883]: I0310 09:22:33.050803 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050836 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050865 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050890 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050912 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050936 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050975 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051011 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051036 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051060 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051087 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") 
pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051112 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051130 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051157 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051175 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051198 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " 
pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051219 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051242 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051263 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051286 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051306 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051345 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051772 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051849 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051928 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.052813 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: 
\"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.053159 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.054067 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.054871 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.055612 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.056427 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc 
kubenswrapper[4883]: I0310 09:22:33.058709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.066553 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.067995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.068667 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.070357 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.072073 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.073794 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.074196 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.077692 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.077737 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.077805 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " 
pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.077885 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.078629 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.080328 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.082955 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.083659 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.084147 4883 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.104012 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.113706 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.190381 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.223978 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.229155 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.250027 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.250909 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:33 crc kubenswrapper[4883]: W0310 09:22:33.253750 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5cc1576_7d04_425c_a8a7_06ee1e8f00ce.slice/crio-fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071 WatchSource:0}: Error finding container fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071: Status 404 returned error can't find the container with id fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071 Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.279536 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.529237 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kcmgr"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.539562 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.558438 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x2hf5"] Mar 10 09:22:33 crc kubenswrapper[4883]: W0310 09:22:33.622769 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a25758_cb77_448b_a856_3dbc6df2bc21.slice/crio-c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e WatchSource:0}: Error finding container c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e: Status 404 returned error can't find the container with id c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e Mar 10 09:22:33 crc 
kubenswrapper[4883]: I0310 09:22:33.671809 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.880524 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.925339 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jt8bs"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.974781 4883 generic.go:334] "Generic (PLEG): container finished" podID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" containerID="b6bf873909c8efbafc4e6b27c6fb3f0194fc7bb6f96a1e76fb3a715d4f30fda7" exitCode=0 Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.974875 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" event={"ID":"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce","Type":"ContainerDied","Data":"b6bf873909c8efbafc4e6b27c6fb3f0194fc7bb6f96a1e76fb3a715d4f30fda7"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.974927 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" event={"ID":"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce","Type":"ContainerStarted","Data":"fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.977850 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerStarted","Data":"3484a773396c34bf3e6006181f0ce251b988a8818cc3d0e090ab197865e41fae"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.980067 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x2hf5" event={"ID":"dc0b1d9d-7834-473a-a487-6f540c606706","Type":"ContainerStarted","Data":"bd6a779566a1daa68a4f86119c6780e52d4493f0681838af48cbb4dbd90f52cc"} Mar 10 09:22:33 crc 
kubenswrapper[4883]: I0310 09:22:33.981138 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerStarted","Data":"b811a98b9eb641299a8060183de1afd8f605b6eb8c5e07f91e568070217a7cad"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.988689 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6889f87769-6j4vp" event={"ID":"a3de0c9e-752c-487f-934f-170386d5f462","Type":"ContainerStarted","Data":"be7e77d7db4925cebe572b32a2b1ac5989560fd1627d674ac963510739fff6e0"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.997368 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kcmgr" event={"ID":"f5a25758-cb77-448b-a856-3dbc6df2bc21","Type":"ContainerStarted","Data":"c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e"} Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.001867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2fk5l" event={"ID":"b0f9ff35-e1f8-4a30-9012-092af1e8ab09","Type":"ContainerStarted","Data":"11797267a9434882035fa9951a681f2d2f7a8ff3989245476cda7f186fafdc21"} Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.001890 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2fk5l" event={"ID":"b0f9ff35-e1f8-4a30-9012-092af1e8ab09","Type":"ContainerStarted","Data":"e5e0e3d6fe3e0b81dac2ef5b9e39c718ea08b72359b689574cb229810532f525"} Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.016103 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kcmgr" podStartSLOduration=2.016086114 podStartE2EDuration="2.016086114s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:34.009186945 +0000 UTC 
m=+1140.264084834" watchObservedRunningTime="2026-03-10 09:22:34.016086114 +0000 UTC m=+1140.270984003" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.029257 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2fk5l" podStartSLOduration=2.029242623 podStartE2EDuration="2.029242623s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:34.025862069 +0000 UTC m=+1140.280759958" watchObservedRunningTime="2026-03-10 09:22:34.029242623 +0000 UTC m=+1140.284140512" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.057415 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v9wqz"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.183053 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:22:34 crc kubenswrapper[4883]: W0310 09:22:34.198517 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87232af6_dc87_4f68_8b1f_850fd98219a8.slice/crio-340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf WatchSource:0}: Error finding container 340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf: Status 404 returned error can't find the container with id 340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.202055 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.251639 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.447735 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523202 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523376 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523450 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523483 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523544 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523628 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.533050 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.558078 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.573153 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7" (OuterVolumeSpecName: "kube-api-access-4gkb7") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "kube-api-access-4gkb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.575518 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:34 crc kubenswrapper[4883]: E0310 09:22:34.576257 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" containerName="init" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.576280 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" containerName="init" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.576637 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" containerName="init" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.583564 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.620424 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.635537 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.650556 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.659560 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.678592 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.691102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config" (OuterVolumeSpecName: "config") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.693128 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.694077 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.727913 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738404 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738501 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738529 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738681 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738756 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " 
pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738824 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738842 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738852 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738862 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738871 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841274 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841389 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") pod 
\"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841522 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841595 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841623 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.842228 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.842896 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.843602 
4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.849149 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.856275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.865884 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.032657 4883 generic.go:334] "Generic (PLEG): container finished" podID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerID="e9f3a5dd63ead621338a9597451694f38d0bb7781f2624ce640a6d0a6dc7e4a2" exitCode=0 Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.032755 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerDied","Data":"e9f3a5dd63ead621338a9597451694f38d0bb7781f2624ce640a6d0a6dc7e4a2"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.033017 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerStarted","Data":"597e041b53a7dcfb1def658755f45ca307eb7a79b514a35cb1ad87244e150850"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.038608 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerStarted","Data":"340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.045210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" event={"ID":"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce","Type":"ContainerDied","Data":"fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.045248 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.045302 4883 scope.go:117] "RemoveContainer" containerID="b6bf873909c8efbafc4e6b27c6fb3f0194fc7bb6f96a1e76fb3a715d4f30fda7" Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.049909 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v9wqz" event={"ID":"78bfcd03-74e4-4238-ae81-043bc04105cd","Type":"ContainerStarted","Data":"a9fc6acc617749c8e1de867b10f66ead08446875d606f1370db6c642ea9067e0"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.057542 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerStarted","Data":"49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.098846 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jt8bs" event={"ID":"6d78560f-1b01-4ac1-9c36-109595422d78","Type":"ContainerStarted","Data":"1ecd75dacc64ce6638ff3cb4e1b982b79b4377e09b6132685f58b6c4877440c0"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.105956 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerStarted","Data":"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.105987 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerStarted","Data":"d2f4c6a75b56fe5a8d7adbca419f37ccbde86d92a924952b6cc9449838c4a617"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.111358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kcmgr" 
event={"ID":"f5a25758-cb77-448b-a856-3dbc6df2bc21","Type":"ContainerStarted","Data":"50e03d7f0d395af101a158c665d03ed09d9fb01ebb9c16be32b6ee2d458d2bcb"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.122527 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.130158 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.387807 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.098073 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" path="/var/lib/kubelet/pods/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce/volumes" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.135193 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-log" containerID="cri-o://49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922" gracePeriod=30 Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.135259 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerStarted","Data":"1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1"} Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.135275 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-httpd" containerID="cri-o://1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1" gracePeriod=30 Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.144786 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54485c5c-cpvdz" event={"ID":"6dde4634-36a7-4142-99e2-9a43f0fbe3fd","Type":"ContainerStarted","Data":"c40847d7eb6f2a5c70938469d31c3d3e46cacefe9ad418b40434e9f2175988e6"} Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.157494 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.157461408 podStartE2EDuration="4.157461408s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:36.154759083 +0000 UTC m=+1142.409656972" watchObservedRunningTime="2026-03-10 09:22:36.157461408 +0000 UTC m=+1142.412359297" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.162455 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-log" containerID="cri-o://883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" gracePeriod=30 Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.162806 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-httpd" containerID="cri-o://8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" gracePeriod=30 Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.162508 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerStarted","Data":"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb"} Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.172692 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerStarted","Data":"368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea"} Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.172730 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.184533 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.184520543 podStartE2EDuration="4.184520543s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:36.183154788 +0000 UTC m=+1142.438052667" watchObservedRunningTime="2026-03-10 09:22:36.184520543 +0000 UTC m=+1142.439418433" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.222245 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" podStartSLOduration=4.222222338 podStartE2EDuration="4.222222338s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:36.205014438 +0000 UTC m=+1142.459912327" watchObservedRunningTime="2026-03-10 09:22:36.222222338 +0000 UTC m=+1142.477120227" Mar 10 09:22:36 crc kubenswrapper[4883]: E0310 09:22:36.313299 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8879ef_c406_48f4_a67a_d5cedcf8e886.slice/crio-8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8879ef_c406_48f4_a67a_d5cedcf8e886.slice/crio-conmon-8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded02e2ef_66b0_48ae_b81b_20ac50baa423.slice/crio-conmon-49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded02e2ef_66b0_48ae_b81b_20ac50baa423.slice/crio-49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.832596 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907020 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907068 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907107 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 
09:22:36.907219 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907239 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907273 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907392 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907491 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.908151 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs" (OuterVolumeSpecName: "logs") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: 
"ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.908273 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.908673 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.908693 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.915508 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.917688 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj" (OuterVolumeSpecName: "kube-api-access-z76vj") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "kube-api-access-z76vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.944435 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts" (OuterVolumeSpecName: "scripts") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.950587 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.969323 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data" (OuterVolumeSpecName: "config-data") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.990080 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010081 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010107 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010118 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010127 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010159 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010168 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.030466 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.112371 4883 reconciler_common.go:293] "Volume detached for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.182968 4883 generic.go:334] "Generic (PLEG): container finished" podID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" containerID="11797267a9434882035fa9951a681f2d2f7a8ff3989245476cda7f186fafdc21" exitCode=0 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.183074 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2fk5l" event={"ID":"b0f9ff35-e1f8-4a30-9012-092af1e8ab09","Type":"ContainerDied","Data":"11797267a9434882035fa9951a681f2d2f7a8ff3989245476cda7f186fafdc21"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.186950 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerID="1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1" exitCode=0 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.186976 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerID="49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922" exitCode=143 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.187033 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerDied","Data":"1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.187102 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerDied","Data":"49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190536 4883 generic.go:334] "Generic (PLEG): container finished" podID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" 
containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" exitCode=143 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190557 4883 generic.go:334] "Generic (PLEG): container finished" podID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" exitCode=143 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190630 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190646 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerDied","Data":"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190793 4883 scope.go:117] "RemoveContainer" containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190720 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerDied","Data":"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.191121 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerDied","Data":"d2f4c6a75b56fe5a8d7adbca419f37ccbde86d92a924952b6cc9449838c4a617"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.254874 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.264053 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 
09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.265390 4883 scope.go:117] "RemoveContainer" containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.279340 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:37 crc kubenswrapper[4883]: E0310 09:22:37.279965 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-log" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.279986 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-log" Mar 10 09:22:37 crc kubenswrapper[4883]: E0310 09:22:37.279998 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-httpd" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.280006 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-httpd" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.280436 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-log" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.280460 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-httpd" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.281826 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.284550 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.284940 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.286698 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.304523 4883 scope.go:117] "RemoveContainer" containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" Mar 10 09:22:37 crc kubenswrapper[4883]: E0310 09:22:37.305210 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": container with ID starting with 8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb not found: ID does not exist" containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.305244 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb"} err="failed to get container status \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": rpc error: code = NotFound desc = could not find container \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": container with ID starting with 8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb not found: ID does not exist" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.306182 4883 scope.go:117] "RemoveContainer" 
containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" Mar 10 09:22:37 crc kubenswrapper[4883]: E0310 09:22:37.308565 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": container with ID starting with 883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f not found: ID does not exist" containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.308609 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f"} err="failed to get container status \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": rpc error: code = NotFound desc = could not find container \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": container with ID starting with 883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f not found: ID does not exist" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.308635 4883 scope.go:117] "RemoveContainer" containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.309160 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb"} err="failed to get container status \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": rpc error: code = NotFound desc = could not find container \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": container with ID starting with 8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb not found: ID does not exist" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.309189 4883 scope.go:117] 
"RemoveContainer" containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.309630 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f"} err="failed to get container status \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": rpc error: code = NotFound desc = could not find container \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": container with ID starting with 883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f not found: ID does not exist" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.424306 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.424720 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425094 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425173 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425296 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425364 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425521 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425586 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 
09:22:37.527695 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.527732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.527858 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.527887 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528141 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528191 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528449 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528672 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528950 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.535002 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.536947 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.537101 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.537111 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.537759 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.543191 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.545969 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") pod 
\"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.563403 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.605970 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.820291 4883 scope.go:117] "RemoveContainer" containerID="3c695c439ea9e2ad9e771b20e1905dadd59374ebe052ec433b69ea1e82161c99" Mar 10 09:22:38 crc kubenswrapper[4883]: I0310 09:22:38.091380 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" path="/var/lib/kubelet/pods/ff8879ef-c406-48f4-a67a-d5cedcf8e886/volumes" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.889986 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.924830 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.937806 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.939140 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.942008 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.964515 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997125 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997193 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997368 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997409 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " 
pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997457 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997507 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997637 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.036566 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.068515 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-555c96ddb-t7tcm"] Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.070071 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.090831 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-555c96ddb-t7tcm"] Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119655 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119741 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119771 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119873 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119903 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119938 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119964 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.124825 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.125408 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.129240 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " 
pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.130539 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.131041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.132020 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.159281 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.222678 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-config-data\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 
09:22:41.222737 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mbr7\" (UniqueName: \"kubernetes.io/projected/ef0598ad-c7ea-4645-b553-7d9028397156-kube-api-access-7mbr7\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.222802 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-tls-certs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.222836 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-secret-key\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.222859 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-scripts\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.223538 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-combined-ca-bundle\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: 
I0310 09:22:41.223686 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0598ad-c7ea-4645-b553-7d9028397156-logs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.261036 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.326884 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-tls-certs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.326956 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-secret-key\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.326981 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-scripts\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327083 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-combined-ca-bundle\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " 
pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327136 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0598ad-c7ea-4645-b553-7d9028397156-logs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327187 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-config-data\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327230 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mbr7\" (UniqueName: \"kubernetes.io/projected/ef0598ad-c7ea-4645-b553-7d9028397156-kube-api-access-7mbr7\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327750 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0598ad-c7ea-4645-b553-7d9028397156-logs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327862 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-scripts\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.328939 4883 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-config-data\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.330763 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-secret-key\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.331041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-combined-ca-bundle\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.331944 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-tls-certs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.344054 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mbr7\" (UniqueName: \"kubernetes.io/projected/ef0598ad-c7ea-4645-b553-7d9028397156-kube-api-access-7mbr7\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.394194 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.845981 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.940489 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.940561 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.940625 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.941321 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkgl9\" (UniqueName: \"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.941351 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: 
\"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.941400 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.946528 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.946887 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9" (OuterVolumeSpecName: "kube-api-access-pkgl9") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "kube-api-access-pkgl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.947902 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.950054 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts" (OuterVolumeSpecName: "scripts") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.964932 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data" (OuterVolumeSpecName: "config-data") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.966120 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044661 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044697 4883 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044710 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044722 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkgl9\" (UniqueName: \"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044735 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044743 4883 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.258525 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2fk5l" event={"ID":"b0f9ff35-e1f8-4a30-9012-092af1e8ab09","Type":"ContainerDied","Data":"e5e0e3d6fe3e0b81dac2ef5b9e39c718ea08b72359b689574cb229810532f525"} Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 
09:22:42.258590 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5e0e3d6fe3e0b81dac2ef5b9e39c718ea08b72359b689574cb229810532f525" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.258625 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.014085 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.023496 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.115587 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n7f74"] Mar 10 09:22:43 crc kubenswrapper[4883]: E0310 09:22:43.116035 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" containerName="keystone-bootstrap" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.116051 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" containerName="keystone-bootstrap" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.116293 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" containerName="keystone-bootstrap" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.118616 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.120417 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.120690 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.120791 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92kmh" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.120868 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.121110 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.121942 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n7f74"] Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.269560 4883 generic.go:334] "Generic (PLEG): container finished" podID="f5a25758-cb77-448b-a856-3dbc6df2bc21" containerID="50e03d7f0d395af101a158c665d03ed09d9fb01ebb9c16be32b6ee2d458d2bcb" exitCode=0 Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.269655 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kcmgr" event={"ID":"f5a25758-cb77-448b-a856-3dbc6df2bc21","Type":"ContainerDied","Data":"50e03d7f0d395af101a158c665d03ed09d9fb01ebb9c16be32b6ee2d458d2bcb"} Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.274763 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 
09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.274906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.275022 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.275178 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.275263 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.275322 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 
09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.282610 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.350108 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.350364 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" containerID="cri-o://751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2" gracePeriod=10 Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.377387 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.377652 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.377875 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.377950 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.378037 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.378096 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.381797 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.382038 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.382131 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") pod \"keystone-bootstrap-n7f74\" (UID: 
\"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.390707 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.396834 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.398005 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.437607 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:44 crc kubenswrapper[4883]: I0310 09:22:44.091749 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" path="/var/lib/kubelet/pods/b0f9ff35-e1f8-4a30-9012-092af1e8ab09/volumes" Mar 10 09:22:44 crc kubenswrapper[4883]: I0310 09:22:44.282889 4883 generic.go:334] "Generic (PLEG): container finished" podID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerID="751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2" exitCode=0 Mar 10 09:22:44 crc kubenswrapper[4883]: I0310 09:22:44.282959 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerDied","Data":"751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2"} Mar 10 09:22:46 crc kubenswrapper[4883]: I0310 09:22:46.386044 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Mar 10 09:22:47 crc kubenswrapper[4883]: I0310 09:22:47.449043 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:22:47 crc kubenswrapper[4883]: I0310 09:22:47.449128 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:22:53 crc kubenswrapper[4883]: 
E0310 09:22:53.285073 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.286053 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n99h694h655h66fhfdh58fh66bh58ch5f4h5b7h6ch97hc5h644h5fbh554hbch646h68fh74h64fh5d4h5c7h5fdh5b9h5cdh56h5bdh5d6h7dhbh567q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-24tkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*t
rue,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6889f87769-6j4vp_openstack(a3de0c9e-752c-487f-934f-170386d5f462): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.288189 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b\\\"\"]" pod="openstack/horizon-6889f87769-6j4vp" podUID="a3de0c9e-752c-487f-934f-170386d5f462" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.291672 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.291934 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6bh648h56chd4hbh574h586h5cdh659h566h5c8h75h79h65dh58bh5cdh684h595hb5h5d7hb6h66dh578h575h6ch544h5dbh54fh544hb6h558h686q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b49zt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-54485c5c-cpvdz_openstack(6dde4634-36a7-4142-99e2-9a43f0fbe3fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 
09:22:53.295019 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b\\\"\"]" pod="openstack/horizon-54485c5c-cpvdz" podUID="6dde4634-36a7-4142-99e2-9a43f0fbe3fd" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.345606 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.356143 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.407105 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kcmgr" event={"ID":"f5a25758-cb77-448b-a856-3dbc6df2bc21","Type":"ContainerDied","Data":"c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e"} Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.407157 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.407229 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.410510 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerDied","Data":"3484a773396c34bf3e6006181f0ce251b988a8818cc3d0e090ab197865e41fae"} Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.410562 4883 scope.go:117] "RemoveContainer" containerID="1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.410807 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438083 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438205 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438242 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") pod \"f5a25758-cb77-448b-a856-3dbc6df2bc21\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438263 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") pod \"f5a25758-cb77-448b-a856-3dbc6df2bc21\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438299 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") pod \"f5a25758-cb77-448b-a856-3dbc6df2bc21\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438337 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438389 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438525 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438623 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438672 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438700 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438720 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs" (OuterVolumeSpecName: "logs") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.439325 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.441102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.454727 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts" (OuterVolumeSpecName: "scripts") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.454815 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.464737 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv" (OuterVolumeSpecName: "kube-api-access-ksmfv") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "kube-api-access-ksmfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.465079 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7" (OuterVolumeSpecName: "kube-api-access-4slp7") pod "f5a25758-cb77-448b-a856-3dbc6df2bc21" (UID: "f5a25758-cb77-448b-a856-3dbc6df2bc21"). InnerVolumeSpecName "kube-api-access-4slp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.469877 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config" (OuterVolumeSpecName: "config") pod "f5a25758-cb77-448b-a856-3dbc6df2bc21" (UID: "f5a25758-cb77-448b-a856-3dbc6df2bc21"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.474798 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5a25758-cb77-448b-a856-3dbc6df2bc21" (UID: "f5a25758-cb77-448b-a856-3dbc6df2bc21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.480102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.496886 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.505380 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data" (OuterVolumeSpecName: "config-data") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544301 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544355 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544370 4883 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544380 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544390 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544402 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544411 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 
09:22:53.544420 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544428 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544437 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.559188 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.650408 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.753326 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.759367 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791088 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.791602 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a25758-cb77-448b-a856-3dbc6df2bc21" containerName="neutron-db-sync" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791623 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f5a25758-cb77-448b-a856-3dbc6df2bc21" containerName="neutron-db-sync" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.791649 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-httpd" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791656 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-httpd" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.791666 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-log" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791674 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-log" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791837 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-log" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791856 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-httpd" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791864 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a25758-cb77-448b-a856-3dbc6df2bc21" containerName="neutron-db-sync" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.792930 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.794854 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.796065 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.805872 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.956931 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957343 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957403 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957436 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957489 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957530 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6w4\" (UniqueName: \"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957641 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957757 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059670 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059825 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059889 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059917 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059937 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059969 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6w4\" (UniqueName: 
\"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.060062 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.060095 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059914 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.060736 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.060814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.066444 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.066599 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.067021 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.068085 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.076112 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6w4\" (UniqueName: \"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") pod 
\"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.080081 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.089921 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" path="/var/lib/kubelet/pods/ed02e2ef-66b0-48ae-b81b-20ac50baa423/volumes" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.112760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.514258 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.517763 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.536276 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.600446 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.603028 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.609936 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vpjch" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.610157 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.610899 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.617154 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.655596 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.677951 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.678711 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.679014 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") pod 
\"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.679151 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.679279 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.679456 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27k8t\" (UniqueName: \"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781071 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781134 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781190 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781225 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781256 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781277 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781315 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27k8t\" (UniqueName: 
\"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781374 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781394 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781412 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.782227 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.783256 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.783848 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.784323 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.785077 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.802784 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27k8t\" (UniqueName: \"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") pod 
\"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.844221 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883394 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883530 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883566 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883689 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.888246 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.888874 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.891621 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.894681 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.896945 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") pod \"neutron-f6f8846bd-rdwfd\" (UID: 
\"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.923834 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.386573 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.610441 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5bc48b486f-2j899"] Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.611860 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.614421 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.614743 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.618951 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"] Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621121 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621168 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj9nk\" 
(UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621200 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621261 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621287 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621317 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621361 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723530 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj9nk\" (UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723615 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723676 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723705 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") pod 
\"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723739 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723786 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.729852 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.730113 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.740453 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 
09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.741889 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.742000 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.742879 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj9nk\" (UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.743048 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.941874 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:57 crc kubenswrapper[4883]: E0310 09:22:57.317534 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3" Mar 10 09:22:57 crc kubenswrapper[4883]: E0310 09:22:57.317990 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6975,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagatio
n:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-v9wqz_openstack(78bfcd03-74e4-4238-ae81-043bc04105cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:22:57 crc kubenswrapper[4883]: E0310 09:22:57.319184 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-v9wqz" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.400818 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.451194 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerDied","Data":"5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c"} Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.451219 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:57 crc kubenswrapper[4883]: E0310 09:22:57.452814 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3\\\"\"" pod="openstack/placement-db-sync-v9wqz" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.538805 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.538894 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.539049 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.539780 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.539822 
4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.539845 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.545357 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd" (OuterVolumeSpecName: "kube-api-access-dpsnd") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "kube-api-access-dpsnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.575997 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.579556 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.580797 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.583232 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config" (OuterVolumeSpecName: "config") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.587450 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642458 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642510 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642524 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642536 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642550 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642563 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.795897 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.804395 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 
09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.098356 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" path="/var/lib/kubelet/pods/631a0fc2-de6d-4778-bce2-46b69c306e44/volumes" Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 09:22:58.573673 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 09:22:58.573947 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lhjm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-x2hf5_openstack(dc0b1d9d-7834-473a-a487-6f540c606706): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 09:22:58.576395 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-x2hf5" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.632385 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.638209 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54485c5c-cpvdz"
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.671915 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.672081 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.672129 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.672189 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.672629 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts" (OuterVolumeSpecName: "scripts") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.673243 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data" (OuterVolumeSpecName: "config-data") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.673265 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data" (OuterVolumeSpecName: "config-data") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.673383 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.673428 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674273 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs" (OuterVolumeSpecName: "logs") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674283 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts" (OuterVolumeSpecName: "scripts") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674377 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674543 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674809 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs" (OuterVolumeSpecName: "logs") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.675076 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.675120 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") "
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676281 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676301 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676311 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676320 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676328 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676336 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.679560 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk" (OuterVolumeSpecName: "kube-api-access-24tkk") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "kube-api-access-24tkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.679705 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt" (OuterVolumeSpecName: "kube-api-access-b49zt") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "kube-api-access-b49zt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.679946 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.680071 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.779431 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.779469 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.779499 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.779508 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 09:22:58.919856 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81"
Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 09:22:58.920010 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n589h648h56h679h5c5h567h59bhd7hbbh59dh58bh545h5c6h99h7bh77h55fh56chcfh98h587h67h5d5h554hd7h547h57fh5c9h556h546h584h57bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wmmc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(87232af6-dc87-4f68-8b1f-850fd98219a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.965142 4883 scope.go:117] "RemoveContainer" containerID="49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922"
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.067995 4883 scope.go:117] "RemoveContainer" containerID="751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2"
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.142267 4883 scope.go:117] "RemoveContainer" containerID="e2b7245fe34a086ba653b170ba754d14297a7a81b375fe409da9fc2787c69d3f"
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.480413 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54485c5c-cpvdz" event={"ID":"6dde4634-36a7-4142-99e2-9a43f0fbe3fd","Type":"ContainerDied","Data":"c40847d7eb6f2a5c70938469d31c3d3e46cacefe9ad418b40434e9f2175988e6"}
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.480865 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54485c5c-cpvdz"
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.499360 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"]
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.508043 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jt8bs" event={"ID":"6d78560f-1b01-4ac1-9c36-109595422d78","Type":"ContainerStarted","Data":"655c18b53a1ce432bb921c39b01b2479f6ad37de70ab46a32b33f42c98fb125b"}
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.514969 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6889f87769-6j4vp" event={"ID":"a3de0c9e-752c-487f-934f-170386d5f462","Type":"ContainerDied","Data":"be7e77d7db4925cebe572b32a2b1ac5989560fd1627d674ac963510739fff6e0"}
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.518953 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n7f74"]
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.542976 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6889f87769-6j4vp"
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.554926 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerStarted","Data":"c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9"}
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.555093 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-798c4d5785-ftwkg" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon-log" containerID="cri-o://c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9" gracePeriod=30
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.555217 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-798c4d5785-ftwkg" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon" containerID="cri-o://0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424" gracePeriod=30
Mar 10 09:22:59 crc kubenswrapper[4883]: E0310 09:22:59.569872 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-x2hf5" podUID="dc0b1d9d-7834-473a-a487-6f540c606706"
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.586802 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"]
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.605066 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-555c96ddb-t7tcm"]
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.622949 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54485c5c-cpvdz"]
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.627973 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54485c5c-cpvdz"]
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.629213 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jt8bs" podStartSLOduration=2.686991563 podStartE2EDuration="27.629195334s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="2026-03-10 09:22:33.973707762 +0000 UTC m=+1140.228605651" lastFinishedPulling="2026-03-10 09:22:58.915911533 +0000 UTC m=+1165.170809422" observedRunningTime="2026-03-10 09:22:59.553152491 +0000 UTC m=+1165.808050380" watchObservedRunningTime="2026-03-10 09:22:59.629195334 +0000 UTC m=+1165.884093222"
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.670391 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-798c4d5785-ftwkg" podStartSLOduration=2.38174907 podStartE2EDuration="27.670372651s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="2026-03-10 09:22:33.62300178 +0000 UTC m=+1139.877899669" lastFinishedPulling="2026-03-10 09:22:58.91162536 +0000 UTC m=+1165.166523250" observedRunningTime="2026-03-10 09:22:59.619037318 +0000 UTC m=+1165.873935207" watchObservedRunningTime="2026-03-10 09:22:59.670372651 +0000 UTC m=+1165.925270540"
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.671027 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.689638 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"]
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.693907 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"]
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.698303 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"]
Mar 10 09:22:59 crc kubenswrapper[4883]: W0310 09:22:59.729840 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77895d16_8ad3_4edb_ae91_d807afd499b3.slice/crio-3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd WatchSource:0}: Error finding container 3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd: Status 404 returned error can't find the container with id 3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd
Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.798757 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"]
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.092936 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dde4634-36a7-4142-99e2-9a43f0fbe3fd" path="/var/lib/kubelet/pods/6dde4634-36a7-4142-99e2-9a43f0fbe3fd/volumes"
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.093490 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3de0c9e-752c-487f-934f-170386d5f462" path="/var/lib/kubelet/pods/a3de0c9e-752c-487f-934f-170386d5f462/volumes"
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.366706 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.576165 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-555c96ddb-t7tcm" event={"ID":"ef0598ad-c7ea-4645-b553-7d9028397156","Type":"ContainerStarted","Data":"174bee987ac064b45e697523f8e929df13b7b8a6d6f5713d0d159940e38b9e6c"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.576205 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-555c96ddb-t7tcm" event={"ID":"ef0598ad-c7ea-4645-b553-7d9028397156","Type":"ContainerStarted","Data":"97c9a096d73de1c3f1ffacb4309122cf185624906c4b45132ebef04bebf6385c"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.576214 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-555c96ddb-t7tcm" event={"ID":"ef0598ad-c7ea-4645-b553-7d9028397156","Type":"ContainerStarted","Data":"66d01928941e03c04f9704e829affc43d592c9977316f15f431dfc6ab92edf9c"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.590608 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerStarted","Data":"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.590663 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerStarted","Data":"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.590676 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerStarted","Data":"3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.591263 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5bc48b486f-2j899"
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.600956 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-555c96ddb-t7tcm" podStartSLOduration=19.600934309 podStartE2EDuration="19.600934309s" podCreationTimestamp="2026-03-10 09:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:00.595955771 +0000 UTC m=+1166.850853660" watchObservedRunningTime="2026-03-10 09:23:00.600934309 +0000 UTC m=+1166.855832189"
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.602150 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerStarted","Data":"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.602180 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerStarted","Data":"98d045da7fa92d2ea6ec832a583b37763ca71714b8c37b66a7d614c5c8099df1"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.604427 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7f74" event={"ID":"5acdd73a-9879-4507-8f6d-10e2ad8065e4","Type":"ContainerStarted","Data":"607df7cc400848b9303988905dd099e3a6ed13423fde585243a8fdb269315664"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.604463 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7f74" event={"ID":"5acdd73a-9879-4507-8f6d-10e2ad8065e4","Type":"ContainerStarted","Data":"0d148133502c63f130313a1a9a41570703ee3c780cfc640a21d5ed94368c1fc2"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.633483 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5bc48b486f-2j899" podStartSLOduration=4.633457808 podStartE2EDuration="4.633457808s" podCreationTimestamp="2026-03-10 09:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:00.611565757 +0000 UTC m=+1166.866463647" watchObservedRunningTime="2026-03-10 09:23:00.633457808 +0000 UTC m=+1166.888355698"
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.639651 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n7f74" podStartSLOduration=17.639642782 podStartE2EDuration="17.639642782s" podCreationTimestamp="2026-03-10 09:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:00.635119813 +0000 UTC m=+1166.890017702" watchObservedRunningTime="2026-03-10 09:23:00.639642782 +0000 UTC m=+1166.894540681"
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.654327 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerStarted","Data":"0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.668349 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerStarted","Data":"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.668378 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerStarted","Data":"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.668390 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerStarted","Data":"023eb4b942e99478ddc5c2302dbb0ec5737ecdcfe04fb54667164182410590d3"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.673143 4883 generic.go:334] "Generic (PLEG): container finished" podID="bb1924b3-495e-4d89-a314-5cc86d567758" containerID="8a043f976dc0ec960bd4342407fe8ea99f6aca698feb5f8a45170e566caea1a7" exitCode=0
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.673220 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerDied","Data":"8a043f976dc0ec960bd4342407fe8ea99f6aca698feb5f8a45170e566caea1a7"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.673251 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerStarted","Data":"6002e8450e8885431aac74b1842d182bc92a0a97fd9c5adce1cdbfe3e6e07596"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.685774 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerStarted","Data":"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.685802 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerStarted","Data":"05151d51868c64168fd42dc513563a4dce7dbaa3cd8c62f4564dbc730c66c6a3"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.689737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerStarted","Data":"71b700d498775842c5c3b3c9b8a10f0292a828bd6a69eebba134bc667b2b5df6"}
Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.716663 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5fcc9bbb48-lf4jb" podStartSLOduration=20.716649092 podStartE2EDuration="20.716649092s" podCreationTimestamp="2026-03-10 09:22:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:00.692790061 +0000 UTC m=+1166.947687950" watchObservedRunningTime="2026-03-10 09:23:00.716649092 +0000 UTC m=+1166.971546980"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.262084 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fcc9bbb48-lf4jb"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.262468 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fcc9bbb48-lf4jb"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.388197 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.395344 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-555c96ddb-t7tcm"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.395439 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-555c96ddb-t7tcm"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.712402 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerStarted","Data":"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560"}
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.714123 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-log" containerID="cri-o://2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" gracePeriod=30
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.716684 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-httpd" containerID="cri-o://14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" gracePeriod=30
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.722170 4883 generic.go:334] "Generic (PLEG): container finished" podID="6d78560f-1b01-4ac1-9c36-109595422d78" containerID="655c18b53a1ce432bb921c39b01b2479f6ad37de70ab46a32b33f42c98fb125b" exitCode=0
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.722253 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jt8bs" event={"ID":"6d78560f-1b01-4ac1-9c36-109595422d78","Type":"ContainerDied","Data":"655c18b53a1ce432bb921c39b01b2479f6ad37de70ab46a32b33f42c98fb125b"}
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.724841 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerStarted","Data":"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731"}
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.733889 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerStarted","Data":"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4"}
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.735303 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f6f8846bd-rdwfd"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.739809 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=24.73979783 podStartE2EDuration="24.73979783s" podCreationTimestamp="2026-03-10 09:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:01.73260713 +0000 UTC m=+1167.987505020" watchObservedRunningTime="2026-03-10 09:23:01.73979783 +0000 UTC m=+1167.994695718"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.749529 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerStarted","Data":"859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36"}
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.761566 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerStarted","Data":"abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e"}
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.761603 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7859c7799c-nf489"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.773054 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f6f8846bd-rdwfd" podStartSLOduration=7.773044232 podStartE2EDuration="7.773044232s" podCreationTimestamp="2026-03-10 09:22:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:01.76760188 +0000 UTC m=+1168.022499769" watchObservedRunningTime="2026-03-10 09:23:01.773044232 +0000 UTC m=+1168.027942121"
Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.784741 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7859c7799c-nf489" podStartSLOduration=7.784721963 podStartE2EDuration="7.784721963s" podCreationTimestamp="2026-03-10 09:22:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:01.783961249 +0000 UTC m=+1168.038859138" watchObservedRunningTime="2026-03-10 09:23:01.784721963 +0000 UTC m=+1168.039619843"
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.506432 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.683985 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") "
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.685547 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") "
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.685861 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") "
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.685892 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") "
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686010 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") "
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686063 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") "
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686121 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") "
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686148 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") "
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686728 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs" (OuterVolumeSpecName: "logs") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686836 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.690847 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.699727 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2" (OuterVolumeSpecName: "kube-api-access-26kq2") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "kube-api-access-26kq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.708617 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts" (OuterVolumeSpecName: "scripts") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.709279 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-798c4d5785-ftwkg"
Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.730224 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.756604 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data" (OuterVolumeSpecName: "config-data") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.772148 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787552 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787574 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787593 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787603 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc 
kubenswrapper[4883]: I0310 09:23:02.787611 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787628 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787636 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787644 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.790898 4883 generic.go:334] "Generic (PLEG): container finished" podID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" containerID="607df7cc400848b9303988905dd099e3a6ed13423fde585243a8fdb269315664" exitCode=0 Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.790976 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7f74" event={"ID":"5acdd73a-9879-4507-8f6d-10e2ad8065e4","Type":"ContainerDied","Data":"607df7cc400848b9303988905dd099e3a6ed13423fde585243a8fdb269315664"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794463 4883 generic.go:334] "Generic (PLEG): container finished" podID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" exitCode=0 Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794514 4883 generic.go:334] "Generic (PLEG): container finished" 
podID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" exitCode=143 Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794566 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerDied","Data":"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794608 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerDied","Data":"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794621 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerDied","Data":"05151d51868c64168fd42dc513563a4dce7dbaa3cd8c62f4564dbc730c66c6a3"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794641 4883 scope.go:117] "RemoveContainer" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794776 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.812914 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.815717 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerStarted","Data":"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.840179 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.840167193 podStartE2EDuration="9.840167193s" podCreationTimestamp="2026-03-10 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:02.838798972 +0000 UTC m=+1169.093696872" watchObservedRunningTime="2026-03-10 09:23:02.840167193 +0000 UTC m=+1169.095065082" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.857564 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.880523 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887176 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:02 crc kubenswrapper[4883]: E0310 09:23:02.887599 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-log" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887618 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-log" Mar 10 09:23:02 crc kubenswrapper[4883]: E0310 09:23:02.887654 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="init" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887662 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="init" Mar 10 09:23:02 crc kubenswrapper[4883]: E0310 09:23:02.887685 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-httpd" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887692 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-httpd" Mar 10 09:23:02 crc kubenswrapper[4883]: E0310 09:23:02.887708 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887713 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887880 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-log" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887913 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887931 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-httpd" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.890159 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.890243 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.892184 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.913425 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.913718 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.992809 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993167 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993382 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc 
kubenswrapper[4883]: I0310 09:23:02.993506 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993571 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993662 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993690 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993719 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " 
pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.056174 4883 scope.go:117] "RemoveContainer" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095068 4883 scope.go:117] "RemoveContainer" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095680 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: E0310 09:23:03.095741 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": container with ID starting with 14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560 not found: ID does not exist" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095781 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095859 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" 
Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095904 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095909 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095946 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095986 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.096004 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 
09:23:03.096029 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095769 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560"} err="failed to get container status \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": rpc error: code = NotFound desc = could not find container \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": container with ID starting with 14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560 not found: ID does not exist" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.099785 4883 scope.go:117] "RemoveContainer" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.100485 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.100753 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: E0310 09:23:03.101741 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": container with ID starting with 2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270 not found: ID does not exist" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.101774 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270"} err="failed to get container status \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": rpc error: code = NotFound desc = could not find container \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": container with ID starting with 2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270 not found: ID does not exist" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.101794 4883 scope.go:117] "RemoveContainer" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.102665 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.104153 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.104601 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560"} err="failed to get container status \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": rpc error: code = NotFound desc = could not find container \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": container with ID starting with 14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560 not found: ID does not exist" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.104747 4883 scope.go:117] "RemoveContainer" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.105208 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270"} err="failed to get container status \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": rpc error: code = NotFound desc = could not find container \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": container with ID starting with 2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270 not found: ID does not exist" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.106639 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.114376 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc 
kubenswrapper[4883]: I0310 09:23:03.116227 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.131501 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.220389 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.222635 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.302022 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") pod \"6d78560f-1b01-4ac1-9c36-109595422d78\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.302134 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") pod \"6d78560f-1b01-4ac1-9c36-109595422d78\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.302536 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") pod \"6d78560f-1b01-4ac1-9c36-109595422d78\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.308614 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d78560f-1b01-4ac1-9c36-109595422d78" (UID: "6d78560f-1b01-4ac1-9c36-109595422d78"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.313989 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8" (OuterVolumeSpecName: "kube-api-access-psns8") pod "6d78560f-1b01-4ac1-9c36-109595422d78" (UID: "6d78560f-1b01-4ac1-9c36-109595422d78"). 
InnerVolumeSpecName "kube-api-access-psns8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.335855 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d78560f-1b01-4ac1-9c36-109595422d78" (UID: "6d78560f-1b01-4ac1-9c36-109595422d78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.405221 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.405245 4883 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.405255 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.740940 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.828754 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jt8bs" event={"ID":"6d78560f-1b01-4ac1-9c36-109595422d78","Type":"ContainerDied","Data":"1ecd75dacc64ce6638ff3cb4e1b982b79b4377e09b6132685f58b6c4877440c0"} Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.828800 4883 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1ecd75dacc64ce6638ff3cb4e1b982b79b4377e09b6132685f58b6c4877440c0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.828905 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.049834 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:04 crc kubenswrapper[4883]: E0310 09:23:04.050355 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d78560f-1b01-4ac1-9c36-109595422d78" containerName="barbican-db-sync" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.050372 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d78560f-1b01-4ac1-9c36-109595422d78" containerName="barbican-db-sync" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.050624 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d78560f-1b01-4ac1-9c36-109595422d78" containerName="barbican-db-sync" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.051643 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.061317 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.061714 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.061819 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q2mjf" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.065771 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.067581 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.072992 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.146901 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.149411 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 
09:23:04.149579 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.149674 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.149926 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.194918 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" path="/var/lib/kubelet/pods/37cf08e9-e871-4798-8990-1c80f7776d8f/volumes" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.195818 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.195848 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.195862 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.195879 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.227062 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.246886 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.253054 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.253495 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.253543 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.253728 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.254138 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.254376 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.254782 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.255246 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.255277 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.255507 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.258642 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.264266 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.264691 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 
09:23:04.269526 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.269827 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7859c7799c-nf489" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="dnsmasq-dns" containerID="cri-o://abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e" gracePeriod=10 Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.278018 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.279284 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.292516 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.294313 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.300921 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.306396 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.307730 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.310315 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.314965 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358065 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358545 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358599 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358681 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.359500 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.364830 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.366328 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.366985 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.379158 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.429316 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.460686 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.460741 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.460768 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.460953 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461022 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") pod 
\"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461042 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461143 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461168 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461196 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461251 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461277 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.467431 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.562887 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.562973 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.562994 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " 
pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563070 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563094 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563122 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563166 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563190 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 
09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563303 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563330 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563348 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.564260 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.564334 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.564335 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.564619 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.565040 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.565243 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.567583 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.570631 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.571287 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.579676 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.579817 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.749736 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.756353 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.838686 4883 generic.go:334] "Generic (PLEG): container finished" podID="bb1924b3-495e-4d89-a314-5cc86d567758" containerID="abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e" exitCode=0 Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.839449 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerDied","Data":"abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e"} Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.839538 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.839553 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.545827 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.549856 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.555874 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.556160 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.563354 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728511 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728575 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728674 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728695 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728726 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728832 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728854 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.831985 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832374 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832539 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832632 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832742 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832814 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.833410 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.839826 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.840718 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.840897 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.841400 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod 
\"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.842130 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.853145 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.874993 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.878413 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerStarted","Data":"4d280b27ae76beef2731a3863818dd720d9ca5f105e0f710f7b3f7d025052c9f"} Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.880493 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7f74" event={"ID":"5acdd73a-9879-4507-8f6d-10e2ad8065e4","Type":"ContainerDied","Data":"0d148133502c63f130313a1a9a41570703ee3c780cfc640a21d5ed94368c1fc2"} Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.880730 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d148133502c63f130313a1a9a41570703ee3c780cfc640a21d5ed94368c1fc2" Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.883617 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerDied","Data":"6002e8450e8885431aac74b1842d182bc92a0a97fd9c5adce1cdbfe3e6e07596"} Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.883641 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6002e8450e8885431aac74b1842d182bc92a0a97fd9c5adce1cdbfe3e6e07596" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.149054 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.162693 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274322 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274437 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274497 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274515 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274540 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274580 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274608 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274681 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274696 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27k8t\" (UniqueName: \"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.275095 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.275147 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 
09:23:08.275228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.283833 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6" (OuterVolumeSpecName: "kube-api-access-7mmp6") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "kube-api-access-7mmp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.289115 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.292956 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts" (OuterVolumeSpecName: "scripts") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.297677 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.304394 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t" (OuterVolumeSpecName: "kube-api-access-27k8t") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "kube-api-access-27k8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.335337 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.371990 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380597 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380623 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380635 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380644 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380654 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380663 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27k8t\" (UniqueName: \"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380677 4883 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 
09:23:08.380686 4883 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.383049 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config" (OuterVolumeSpecName: "config") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.408885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.411303 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data" (OuterVolumeSpecName: "config-data") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.412854 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.462433 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"] Mar 10 09:23:08 crc kubenswrapper[4883]: W0310 09:23:08.466795 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce11c091_db9b_47fb_8427_dcaa2585a4c7.slice/crio-f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa WatchSource:0}: Error finding container f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa: Status 404 returned error can't find the container with id f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.483981 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.484009 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.484019 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.484028 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.554623 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:08 crc 
kubenswrapper[4883]: I0310 09:23:08.565284 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:08 crc kubenswrapper[4883]: W0310 09:23:08.581434 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod627e20df_c100_4dac_b344_efda8eda195a.slice/crio-c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed WatchSource:0}: Error finding container c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed: Status 404 returned error can't find the container with id c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed Mar 10 09:23:08 crc kubenswrapper[4883]: W0310 09:23:08.582313 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ac116f_f773_4b3f_a508_bc304668da18.slice/crio-867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db WatchSource:0}: Error finding container 867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db: Status 404 returned error can't find the container with id 867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.691360 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.714269 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.843081 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.869746 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 
09:23:08.975420 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerStarted","Data":"840eb0e983eac3e58e23b21ee91762a03097be568c06b5951bfe32f90ffa8f08"} Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.992620 4883 generic.go:334] "Generic (PLEG): container finished" podID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerID="611c5c4793d5a469fa00a11e611df52aec3fea84115f5f16327469e87284b34b" exitCode=0 Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.992684 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerDied","Data":"611c5c4793d5a469fa00a11e611df52aec3fea84115f5f16327469e87284b34b"} Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.992705 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerStarted","Data":"f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.008618 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerStarted","Data":"e6c22b5b53507d702e074cf63768ed9e237d5a5c16f54e7d10767483ac8e989f"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.023707 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerStarted","Data":"9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.031635 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" 
event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerStarted","Data":"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.031694 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerStarted","Data":"c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.045824 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerStarted","Data":"68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.075711 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.077755 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerStarted","Data":"f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.077793 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerStarted","Data":"867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.078837 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.194408 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.206681 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.246247 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-744f4576f6-kglt9"] Mar 10 09:23:09 crc kubenswrapper[4883]: E0310 09:23:09.246783 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" containerName="keystone-bootstrap" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.246806 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" containerName="keystone-bootstrap" Mar 10 09:23:09 crc kubenswrapper[4883]: E0310 09:23:09.246820 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="dnsmasq-dns" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.246827 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="dnsmasq-dns" Mar 10 09:23:09 crc kubenswrapper[4883]: E0310 09:23:09.246838 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="init" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.246844 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="init" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.247087 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" containerName="keystone-bootstrap" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.247111 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="dnsmasq-dns" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.247847 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.255868 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-744f4576f6-kglt9"] Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258151 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258251 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258342 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258531 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258771 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92kmh" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258893 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405149 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-scripts\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405189 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-config-data\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405324 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-combined-ca-bundle\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405364 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-internal-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405456 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-public-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405676 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-fernet-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405779 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-credential-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405886 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnv92\" (UniqueName: \"kubernetes.io/projected/c6effa97-6f88-4706-98bc-b51af01bd993-kube-api-access-mnv92\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507184 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-scripts\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507236 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-config-data\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507292 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-combined-ca-bundle\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507316 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-internal-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507348 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-public-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507403 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-fernet-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507444 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-credential-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507507 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnv92\" (UniqueName: \"kubernetes.io/projected/c6effa97-6f88-4706-98bc-b51af01bd993-kube-api-access-mnv92\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.516790 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-scripts\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.516997 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-config-data\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.520826 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-combined-ca-bundle\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.523898 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-credential-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.525669 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-public-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.526959 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-fernet-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: 
\"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.531020 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnv92\" (UniqueName: \"kubernetes.io/projected/c6effa97-6f88-4706-98bc-b51af01bd993-kube-api-access-mnv92\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.532868 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-internal-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.576598 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.033175 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-744f4576f6-kglt9"] Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.092998 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" path="/var/lib/kubelet/pods/bb1924b3-495e-4d89-a314-5cc86d567758/volumes" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.096931 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerStarted","Data":"7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc"} Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.098145 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 
09:23:10.098181 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.101415 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerStarted","Data":"25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2"} Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.101924 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.112848 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerStarted","Data":"a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82"} Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.121831 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerStarted","Data":"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c"} Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.122379 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.122415 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.129103 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b5b6f75db-kgz55" podStartSLOduration=4.129081758 podStartE2EDuration="4.129081758s" podCreationTimestamp="2026-03-10 09:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:10.120226539 +0000 UTC m=+1176.375124427" watchObservedRunningTime="2026-03-10 09:23:10.129081758 +0000 UTC m=+1176.383979646" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.147135 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" podStartSLOduration=6.147112627 podStartE2EDuration="6.147112627s" podCreationTimestamp="2026-03-10 09:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:10.139415413 +0000 UTC m=+1176.394313302" watchObservedRunningTime="2026-03-10 09:23:10.147112627 +0000 UTC m=+1176.402010516" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.178952 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.178923483 podStartE2EDuration="8.178923483s" podCreationTimestamp="2026-03-10 09:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:10.165752426 +0000 UTC m=+1176.420650315" watchObservedRunningTime="2026-03-10 09:23:10.178923483 +0000 UTC m=+1176.433821372" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.202395 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-857bd77984-4wnsb" podStartSLOduration=6.202370357 podStartE2EDuration="6.202370357s" podCreationTimestamp="2026-03-10 09:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:10.190899275 +0000 UTC m=+1176.445797155" watchObservedRunningTime="2026-03-10 09:23:10.202370357 +0000 UTC m=+1176.457268245" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.148049 4883 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerStarted","Data":"8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.148370 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerStarted","Data":"b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.152494 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerStarted","Data":"f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.154930 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-744f4576f6-kglt9" event={"ID":"c6effa97-6f88-4706-98bc-b51af01bd993","Type":"ContainerStarted","Data":"6139fe2e637ef029d92f45bf936438fe03f3dfbd08565e1cb11592cee96f67f5"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.154959 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-744f4576f6-kglt9" event={"ID":"c6effa97-6f88-4706-98bc-b51af01bd993","Type":"ContainerStarted","Data":"66165c2a4a853e9120bb23d143e2a8d0459c59cf8b3c081ac55da2c5c1ff9daf"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.173631 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-76f6589d69-9q47v" podStartSLOduration=6.473579475 podStartE2EDuration="8.173611013s" podCreationTimestamp="2026-03-10 09:23:03 +0000 UTC" firstStartedPulling="2026-03-10 09:23:08.731346506 +0000 UTC m=+1174.986244384" lastFinishedPulling="2026-03-10 09:23:10.431378034 +0000 UTC m=+1176.686275922" observedRunningTime="2026-03-10 
09:23:11.166838562 +0000 UTC m=+1177.421736451" watchObservedRunningTime="2026-03-10 09:23:11.173611013 +0000 UTC m=+1177.428508901" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.196405 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-744f4576f6-kglt9" podStartSLOduration=2.196391429 podStartE2EDuration="2.196391429s" podCreationTimestamp="2026-03-10 09:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:11.18875565 +0000 UTC m=+1177.443653539" watchObservedRunningTime="2026-03-10 09:23:11.196391429 +0000 UTC m=+1177.451289318" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.279553 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.312113 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5d48955d69-pvbn8"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.313685 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.329865 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.331040 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.349947 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data-custom\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.350033 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221490bc-406a-436f-8705-66106ed6bbe0-logs\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.350071 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-combined-ca-bundle\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.350092 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb2vv\" (UniqueName: \"kubernetes.io/projected/221490bc-406a-436f-8705-66106ed6bbe0-kube-api-access-lb2vv\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.350131 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.356648 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d48955d69-pvbn8"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.369152 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.399192 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-555c96ddb-t7tcm" podUID="ef0598ad-c7ea-4645-b553-7d9028397156" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.419873 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452197 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data-custom\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data-custom\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc 
kubenswrapper[4883]: I0310 09:23:11.452501 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452730 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqck4\" (UniqueName: \"kubernetes.io/projected/dce7df3b-5f31-4732-8d27-8e06dc07824d-kube-api-access-nqck4\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452852 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221490bc-406a-436f-8705-66106ed6bbe0-logs\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452971 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce7df3b-5f31-4732-8d27-8e06dc07824d-logs\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453067 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-combined-ca-bundle\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: 
\"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453172 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb2vv\" (UniqueName: \"kubernetes.io/projected/221490bc-406a-436f-8705-66106ed6bbe0-kube-api-access-lb2vv\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453275 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-combined-ca-bundle\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453218 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221490bc-406a-436f-8705-66106ed6bbe0-logs\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453464 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.458225 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b55764b68-l794s"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.462922 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data-custom\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.468501 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.472917 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-combined-ca-bundle\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.475047 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.476105 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb2vv\" (UniqueName: \"kubernetes.io/projected/221490bc-406a-436f-8705-66106ed6bbe0-kube-api-access-lb2vv\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.486998 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b55764b68-l794s"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555234 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqck4\" (UniqueName: \"kubernetes.io/projected/dce7df3b-5f31-4732-8d27-8e06dc07824d-kube-api-access-nqck4\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555283 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-combined-ca-bundle\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555318 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-internal-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555350 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce7df3b-5f31-4732-8d27-8e06dc07824d-logs\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-combined-ca-bundle\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555488 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd5mp\" (UniqueName: \"kubernetes.io/projected/17a46674-c6ec-4128-8285-d71c228d11c8-kube-api-access-sd5mp\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555532 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data-custom\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555703 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-public-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 
09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555796 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555840 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data-custom\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555875 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.556009 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a46674-c6ec-4128-8285-d71c228d11c8-logs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.556203 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce7df3b-5f31-4732-8d27-8e06dc07824d-logs\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.559961 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-combined-ca-bundle\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.561073 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data-custom\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.566487 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.575902 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqck4\" (UniqueName: \"kubernetes.io/projected/dce7df3b-5f31-4732-8d27-8e06dc07824d-kube-api-access-nqck4\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.650645 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d48955d69-pvbn8"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658432 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658483 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data-custom\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658602 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a46674-c6ec-4128-8285-d71c228d11c8-logs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-combined-ca-bundle\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658731 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-internal-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658881 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd5mp\" (UniqueName: \"kubernetes.io/projected/17a46674-c6ec-4128-8285-d71c228d11c8-kube-api-access-sd5mp\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658951 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-public-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658990 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a46674-c6ec-4128-8285-d71c228d11c8-logs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.663056 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.663194 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data-custom\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.664993 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-public-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.665355 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-internal-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.666724 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.668201 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-combined-ca-bundle\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.672439 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd5mp\" (UniqueName: \"kubernetes.io/projected/17a46674-c6ec-4128-8285-d71c228d11c8-kube-api-access-sd5mp\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.803958 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.179350 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d48955d69-pvbn8"]
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.194458 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d48955d69-pvbn8" event={"ID":"221490bc-406a-436f-8705-66106ed6bbe0","Type":"ContainerStarted","Data":"b09a865ddd7a1b78c9c31ffc2f1b2e50541a572d4caa8b73de593c885e1724d8"}
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.201157 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerStarted","Data":"d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb"}
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.221451 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v9wqz" event={"ID":"78bfcd03-74e4-4238-ae81-043bc04105cd","Type":"ContainerStarted","Data":"447fb8e21bbb8bc037b2913c14d7e94664156b41e117d80c9e5a7c41f589d745"}
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.221542 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"]
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.223630 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-744f4576f6-kglt9"
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.227154 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" podStartSLOduration=6.080943312 podStartE2EDuration="8.227139247s" podCreationTimestamp="2026-03-10 09:23:04 +0000 UTC" firstStartedPulling="2026-03-10 09:23:08.731889469 +0000 UTC m=+1174.986787348" lastFinishedPulling="2026-03-10 09:23:10.878085404 +0000 UTC m=+1177.132983283" observedRunningTime="2026-03-10 09:23:12.22028404 +0000 UTC m=+1178.475181930" watchObservedRunningTime="2026-03-10 09:23:12.227139247 +0000 UTC m=+1178.482037137"
Mar 10 09:23:12 crc kubenswrapper[4883]: W0310 09:23:12.229534 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddce7df3b_5f31_4732_8d27_8e06dc07824d.slice/crio-09ad10ec780618f78419872c78b5eaf33bd53b32422fe584bbf68183e33f6331 WatchSource:0}: Error finding container 09ad10ec780618f78419872c78b5eaf33bd53b32422fe584bbf68183e33f6331: Status 404 returned error can't find the container with id 09ad10ec780618f78419872c78b5eaf33bd53b32422fe584bbf68183e33f6331
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.238334 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-v9wqz" podStartSLOduration=2.763724858 podStartE2EDuration="40.238317757s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="2026-03-10 09:22:34.092652535 +0000 UTC m=+1140.347550414" lastFinishedPulling="2026-03-10 09:23:11.567245424 +0000 UTC m=+1177.822143313" observedRunningTime="2026-03-10 09:23:12.23613474 +0000 UTC m=+1178.491032629" watchObservedRunningTime="2026-03-10 09:23:12.238317757 +0000 UTC m=+1178.493215645"
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.314244 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b55764b68-l794s"]
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.220781 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.221094 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.231317 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d48955d69-pvbn8" event={"ID":"221490bc-406a-436f-8705-66106ed6bbe0","Type":"ContainerStarted","Data":"830b49df936b02d19e6ae120338dd8e00727b41bd2ed719757de5b5c9c91bc0c"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.231371 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d48955d69-pvbn8" event={"ID":"221490bc-406a-436f-8705-66106ed6bbe0","Type":"ContainerStarted","Data":"293426d63ccc27304336f106c307c54dd630999057afb9451e63075fe697aa48"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.233898 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b55764b68-l794s" event={"ID":"17a46674-c6ec-4128-8285-d71c228d11c8","Type":"ContainerStarted","Data":"7d335d4784197fada2572dc8909148d75e141ba92964e9663bfde49a3713142d"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.234160 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b55764b68-l794s" event={"ID":"17a46674-c6ec-4128-8285-d71c228d11c8","Type":"ContainerStarted","Data":"18e716d7c9c21fbf104b169821c50a69b6dce5fd0637a0d22aa31622a442a36c"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.234194 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b55764b68-l794s" event={"ID":"17a46674-c6ec-4128-8285-d71c228d11c8","Type":"ContainerStarted","Data":"62af63d0f1e089100b9be851d041522e73589ef25a611d1597ee11eeca30449a"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236188 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" event={"ID":"dce7df3b-5f31-4732-8d27-8e06dc07824d","Type":"ContainerStarted","Data":"11eba696e00d12ccbbb3d6ff0afa1f69ee6059e203e2827ba694d408742d8f2c"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236215 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" event={"ID":"dce7df3b-5f31-4732-8d27-8e06dc07824d","Type":"ContainerStarted","Data":"1bb3b108ebf5b9acbfe8dce08e655bdccc54926cdeb51524044f4bcc719141e1"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236245 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" event={"ID":"dce7df3b-5f31-4732-8d27-8e06dc07824d","Type":"ContainerStarted","Data":"09ad10ec780618f78419872c78b5eaf33bd53b32422fe584bbf68183e33f6331"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236416 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" containerID="cri-o://4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" gracePeriod=30
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236597 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" containerID="cri-o://7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" gracePeriod=30
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.256496 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d48955d69-pvbn8" podStartSLOduration=2.25510016 podStartE2EDuration="2.25510016s" podCreationTimestamp="2026-03-10 09:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:13.246161556 +0000 UTC m=+1179.501059444" watchObservedRunningTime="2026-03-10 09:23:13.25510016 +0000 UTC m=+1179.509998050"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.275573 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.288619 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b55764b68-l794s" podStartSLOduration=2.2886027860000002 podStartE2EDuration="2.288602786s" podCreationTimestamp="2026-03-10 09:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:13.264025551 +0000 UTC m=+1179.518923430" watchObservedRunningTime="2026-03-10 09:23:13.288602786 +0000 UTC m=+1179.543500674"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.320213 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"]
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.320437 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-76f6589d69-9q47v" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker-log" containerID="cri-o://b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe" gracePeriod=30
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.320872 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-76f6589d69-9q47v" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker" containerID="cri-o://8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665" gracePeriod=30
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.326955 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.332709 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.336313 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" podStartSLOduration=2.336291101 podStartE2EDuration="2.336291101s" podCreationTimestamp="2026-03-10 09:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:13.291095737 +0000 UTC m=+1179.545993626" watchObservedRunningTime="2026-03-10 09:23:13.336291101 +0000 UTC m=+1179.591188991"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.353198 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"]
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.266463 4883 generic.go:334] "Generic (PLEG): container finished" podID="78bfcd03-74e4-4238-ae81-043bc04105cd" containerID="447fb8e21bbb8bc037b2913c14d7e94664156b41e117d80c9e5a7c41f589d745" exitCode=0
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.266725 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v9wqz" event={"ID":"78bfcd03-74e4-4238-ae81-043bc04105cd","Type":"ContainerDied","Data":"447fb8e21bbb8bc037b2913c14d7e94664156b41e117d80c9e5a7c41f589d745"}
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.275134 4883 generic.go:334] "Generic (PLEG): container finished" podID="13211306-6813-40e0-91c2-5f8ac43968c6" containerID="8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665" exitCode=0
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.275176 4883 generic.go:334] "Generic (PLEG): container finished" podID="13211306-6813-40e0-91c2-5f8ac43968c6" containerID="b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe" exitCode=143
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.275194 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerDied","Data":"8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665"}
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.275244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerDied","Data":"b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe"}
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.281732 4883 generic.go:334] "Generic (PLEG): container finished" podID="627e20df-c100-4dac-b344-efda8eda195a" containerID="4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" exitCode=143
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.284440 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerDied","Data":"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1"}
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.286093 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener-log" containerID="cri-o://f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7" gracePeriod=30
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.286394 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener" containerID="cri-o://d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb" gracePeriod=30
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.286527 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.291098 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.291117 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.291129 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.752697 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.831688 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"]
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.831955 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" containerID="cri-o://368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea" gracePeriod=10
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.297539 4883 generic.go:334] "Generic (PLEG): container finished" podID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerID="d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb" exitCode=0
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.297965 4883 generic.go:334] "Generic (PLEG): container finished" podID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerID="f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7" exitCode=143
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.297742 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerDied","Data":"d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb"}
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.298100 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerDied","Data":"f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7"}
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.301764 4883 generic.go:334] "Generic (PLEG): container finished" podID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerID="368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea" exitCode=0
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.303113 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerDied","Data":"368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea"}
Mar 10 09:23:16 crc kubenswrapper[4883]: I0310 09:23:16.219343 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:16 crc kubenswrapper[4883]: I0310 09:23:16.223554 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:17 crc kubenswrapper[4883]: I0310 09:23:17.449494 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:23:17 crc kubenswrapper[4883]: I0310 09:23:17.449831 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.106630 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b5b6f75db-kgz55"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.281088 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: connect: connection refused"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.415346 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b5b6f75db-kgz55"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.478271 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v9wqz"
Mar 10 09:23:18 crc kubenswrapper[4883]: E0310 09:23:18.634696 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.649572 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.649688 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.651585 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.651721 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.651821 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.652704 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs" (OuterVolumeSpecName: "logs") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.652979 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.658523 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975" (OuterVolumeSpecName: "kube-api-access-x6975") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "kube-api-access-x6975". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.659365 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts" (OuterVolumeSpecName: "scripts") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.673918 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:60948->10.217.0.161:9311: read: connection reset by peer"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.674200 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:60940->10.217.0.161:9311: read: connection reset by peer"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.684638 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data" (OuterVolumeSpecName: "config-data") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.756895 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.757146 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.757157 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.762673 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.768696 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.856414 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76f6589d69-9q47v"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.856847 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.878977 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.879147 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.879230 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.879261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.879298 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.880133 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\"
(UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.881223 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs" (OuterVolumeSpecName: "logs") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.884723 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl" (OuterVolumeSpecName: "kube-api-access-zj2sl") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "kube-api-access-zj2sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.888681 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75" (OuterVolumeSpecName: "kube-api-access-jjg75") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "kube-api-access-jjg75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.914154 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.956945 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.981342 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.981389 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982225 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982297 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982315 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982334 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982362 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982383 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982399 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: 
\"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982418 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982878 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982891 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982903 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982913 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982922 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.983996 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs" (OuterVolumeSpecName: "logs") pod 
"329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.985929 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.995808 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.995889 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt" (OuterVolumeSpecName: "kube-api-access-nbzvt") pod "329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "kube-api-access-nbzvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.084746 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.085279 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087290 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087322 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087332 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087344 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087356 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087365 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.109985 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.113836 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.117572 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data" (OuterVolumeSpecName: "config-data") pod "329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.121257 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data" (OuterVolumeSpecName: "config-data") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.146418 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config" (OuterVolumeSpecName: "config") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190616 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190645 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190659 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190670 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190678 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.199833 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.296092 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.296161 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.296227 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.296284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.301402 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.312612 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj" (OuterVolumeSpecName: "kube-api-access-h4rhj") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "kube-api-access-h4rhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.336824 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.342688 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data" (OuterVolumeSpecName: "config-data") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.377810 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v9wqz" event={"ID":"78bfcd03-74e4-4238-ae81-043bc04105cd","Type":"ContainerDied","Data":"a9fc6acc617749c8e1de867b10f66ead08446875d606f1370db6c642ea9067e0"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.377839 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-v9wqz" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.377860 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9fc6acc617749c8e1de867b10f66ead08446875d606f1370db6c642ea9067e0" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.379494 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x2hf5" event={"ID":"dc0b1d9d-7834-473a-a487-6f540c606706","Type":"ContainerStarted","Data":"0058e7bce226cca970138ded52e080821de3de3f9381dfb13e4eee1dea226cbf"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.382141 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.382129 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerDied","Data":"840eb0e983eac3e58e23b21ee91762a03097be568c06b5951bfe32f90ffa8f08"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.382278 4883 scope.go:117] "RemoveContainer" containerID="8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.390960 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerDied","Data":"e6c22b5b53507d702e074cf63768ed9e237d5a5c16f54e7d10767483ac8e989f"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.391091 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.395781 4883 generic.go:334] "Generic (PLEG): container finished" podID="627e20df-c100-4dac-b344-efda8eda195a" containerID="7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" exitCode=0 Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.395836 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerDied","Data":"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.395857 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerDied","Data":"c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.395899 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.397771 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.398981 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.399001 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.399011 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.399021 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.404759 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs" (OuterVolumeSpecName: "logs") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.406291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerDied","Data":"597e041b53a7dcfb1def658755f45ca307eb7a79b514a35cb1ad87244e150850"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.406425 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409062 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerStarted","Data":"82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409198 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="ceilometer-notification-agent" containerID="cri-o://859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36" gracePeriod=30 Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409414 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409467 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="proxy-httpd" containerID="cri-o://82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21" gracePeriod=30 Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409530 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="sg-core" 
containerID="cri-o://68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b" gracePeriod=30 Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.428963 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-x2hf5" podStartSLOduration=2.666803312 podStartE2EDuration="47.428934081s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="2026-03-10 09:22:33.623041735 +0000 UTC m=+1139.877939624" lastFinishedPulling="2026-03-10 09:23:18.385172503 +0000 UTC m=+1184.640070393" observedRunningTime="2026-03-10 09:23:19.404133996 +0000 UTC m=+1185.659031876" watchObservedRunningTime="2026-03-10 09:23:19.428934081 +0000 UTC m=+1185.683831970" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.436972 4883 scope.go:117] "RemoveContainer" containerID="b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.466529 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.466752 4883 scope.go:117] "RemoveContainer" containerID="d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.486092 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.490911 4883 scope.go:117] "RemoveContainer" containerID="f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.495744 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.500611 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") 
on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.501290 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.519352 4883 scope.go:117] "RemoveContainer" containerID="7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.544332 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.566242 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603347 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5946656968-5mzlm"] Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603874 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603888 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker-log" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603907 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603913 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603922 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603929 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener-log" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603940 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603946 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603957 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603963 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603978 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603984 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603995 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="init" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604001 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="init" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.604013 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604019 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.604032 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" containerName="placement-db-sync" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604038 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" containerName="placement-db-sync" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604387 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604415 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604425 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604435 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604447 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604465 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604491 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" containerName="placement-db-sync" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604500 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.607297 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.609850 4883 scope.go:117] "RemoveContainer" containerID="4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.613780 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.614779 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4s72j" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.616249 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.616443 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5946656968-5mzlm"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.622377 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.624960 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.680224 4883 scope.go:117] "RemoveContainer" containerID="7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.680777 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c\": container with ID starting with 
7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c not found: ID does not exist" containerID="7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.680818 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c"} err="failed to get container status \"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c\": rpc error: code = NotFound desc = could not find container \"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c\": container with ID starting with 7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c not found: ID does not exist" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.680843 4883 scope.go:117] "RemoveContainer" containerID="4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.681090 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1\": container with ID starting with 4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1 not found: ID does not exist" containerID="4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.681125 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1"} err="failed to get container status \"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1\": rpc error: code = NotFound desc = could not find container \"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1\": container with ID starting with 4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1 not found: ID does not 
exist" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.681146 4883 scope.go:117] "RemoveContainer" containerID="368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.701334 4883 scope.go:117] "RemoveContainer" containerID="e9f3a5dd63ead621338a9597451694f38d0bb7781f2624ce640a6d0a6dc7e4a2" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712576 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-config-data\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712660 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-scripts\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712686 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-public-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712730 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-combined-ca-bundle\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc 
kubenswrapper[4883]: I0310 09:23:19.712847 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-internal-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309c3af5-db30-48b8-8118-471950b7312c-logs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712926 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wczn\" (UniqueName: \"kubernetes.io/projected/309c3af5-db30-48b8-8118-471950b7312c-kube-api-access-4wczn\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.746181 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.753811 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815500 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-combined-ca-bundle\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815601 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-internal-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815641 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309c3af5-db30-48b8-8118-471950b7312c-logs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815703 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wczn\" (UniqueName: \"kubernetes.io/projected/309c3af5-db30-48b8-8118-471950b7312c-kube-api-access-4wczn\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815781 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-config-data\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815822 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-scripts\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815845 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-public-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.816907 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309c3af5-db30-48b8-8118-471950b7312c-logs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.824669 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-scripts\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.824758 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-config-data\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.825152 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-internal-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.827741 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-combined-ca-bundle\") pod \"placement-5946656968-5mzlm\" (UID: 
\"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.828999 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-public-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.835374 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wczn\" (UniqueName: \"kubernetes.io/projected/309c3af5-db30-48b8-8118-471950b7312c-kube-api-access-4wczn\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.962910 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.095985 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" path="/var/lib/kubelet/pods/13211306-6813-40e0-91c2-5f8ac43968c6/volumes" Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.096921 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" path="/var/lib/kubelet/pods/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499/volumes" Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.097813 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" path="/var/lib/kubelet/pods/329b0d09-6aa9-427f-8b4b-209bb2c6707b/volumes" Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.099078 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627e20df-c100-4dac-b344-efda8eda195a" 
path="/var/lib/kubelet/pods/627e20df-c100-4dac-b344-efda8eda195a/volumes" Mar 10 09:23:20 crc kubenswrapper[4883]: W0310 09:23:20.391697 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod309c3af5_db30_48b8_8118_471950b7312c.slice/crio-b61d7b6065c243f6373050e2a8de83e69a1990dc6e4913c4f3260a337aeab001 WatchSource:0}: Error finding container b61d7b6065c243f6373050e2a8de83e69a1990dc6e4913c4f3260a337aeab001: Status 404 returned error can't find the container with id b61d7b6065c243f6373050e2a8de83e69a1990dc6e4913c4f3260a337aeab001 Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.393150 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5946656968-5mzlm"] Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.443414 4883 generic.go:334] "Generic (PLEG): container finished" podID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerID="82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21" exitCode=0 Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.443446 4883 generic.go:334] "Generic (PLEG): container finished" podID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerID="68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b" exitCode=2 Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.443519 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerDied","Data":"82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21"} Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.443570 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerDied","Data":"68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b"} Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.444798 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-5946656968-5mzlm" event={"ID":"309c3af5-db30-48b8-8118-471950b7312c","Type":"ContainerStarted","Data":"b61d7b6065c243f6373050e2a8de83e69a1990dc6e4913c4f3260a337aeab001"} Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.459463 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5946656968-5mzlm" event={"ID":"309c3af5-db30-48b8-8118-471950b7312c","Type":"ContainerStarted","Data":"e79b3bbae1ef491e3eb1cf5d5527c1a8f3f25ca7b001953b719564deb9bcc0f4"} Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.459796 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5946656968-5mzlm" event={"ID":"309c3af5-db30-48b8-8118-471950b7312c","Type":"ContainerStarted","Data":"49ea9b54ba2a9c9f6832dee8075a00229502735ff96263c0039328cb99965844"} Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.460085 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.460109 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.489243 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5946656968-5mzlm" podStartSLOduration=2.489224986 podStartE2EDuration="2.489224986s" podCreationTimestamp="2026-03-10 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:21.48016418 +0000 UTC m=+1187.735062069" watchObservedRunningTime="2026-03-10 09:23:21.489224986 +0000 UTC m=+1187.744122875" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.471189 4883 generic.go:334] "Generic (PLEG): container finished" podID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerID="859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36" exitCode=0 Mar 
10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.471267 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerDied","Data":"859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36"} Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.473359 4883 generic.go:334] "Generic (PLEG): container finished" podID="dc0b1d9d-7834-473a-a487-6f540c606706" containerID="0058e7bce226cca970138ded52e080821de3de3f9381dfb13e4eee1dea226cbf" exitCode=0 Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.473466 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x2hf5" event={"ID":"dc0b1d9d-7834-473a-a487-6f540c606706","Type":"ContainerDied","Data":"0058e7bce226cca970138ded52e080821de3de3f9381dfb13e4eee1dea226cbf"} Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.772087 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788469 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788577 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788630 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") pod 
\"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788701 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788911 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788975 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.789005 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.790229 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.794444 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.820637 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts" (OuterVolumeSpecName: "scripts") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.824222 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9" (OuterVolumeSpecName: "kube-api-access-wmmc9") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "kube-api-access-wmmc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.836828 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.849465 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.855739 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data" (OuterVolumeSpecName: "config-data") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.890933 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892300 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892330 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892341 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892351 4883 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892361 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892369 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892378 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.994936 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.099574 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.145889 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.218121 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.218567 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b5b6f75db-kgz55" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api-log" 
containerID="cri-o://f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a" gracePeriod=30 Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.218685 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b5b6f75db-kgz55" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api" containerID="cri-o://7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc" gracePeriod=30 Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.486064 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerDied","Data":"340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf"} Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.486135 4883 scope.go:117] "RemoveContainer" containerID="82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.486129 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.489441 4883 generic.go:334] "Generic (PLEG): container finished" podID="41ac116f-f773-4b3f-a508-bc304668da18" containerID="f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a" exitCode=143 Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.490674 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerDied","Data":"f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a"} Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.513291 4883 scope.go:117] "RemoveContainer" containerID="68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.540373 4883 scope.go:117] "RemoveContainer" containerID="859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.556798 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.566160 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.576748 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: E0310 09:23:23.577130 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="ceilometer-notification-agent" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577148 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="ceilometer-notification-agent" Mar 10 09:23:23 crc kubenswrapper[4883]: E0310 09:23:23.577172 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="proxy-httpd" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577179 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="proxy-httpd" Mar 10 09:23:23 crc kubenswrapper[4883]: E0310 09:23:23.577193 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="sg-core" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577200 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="sg-core" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577359 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="sg-core" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577373 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="ceilometer-notification-agent" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577381 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="proxy-httpd" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.578843 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.581622 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.581823 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.582895 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605106 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605145 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605249 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605303 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghhwq\" (UniqueName: \"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") pod \"ceilometer-0\" (UID: 
\"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605320 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605379 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605671 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.693155 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: E0310 09:23:23.694139 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-ghhwq log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="6c96038a-8a42-4863-a89f-5076c24da12e" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707742 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707820 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707844 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707913 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707953 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghhwq\" (UniqueName: \"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707973 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.708022 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.708603 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.709203 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.719765 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.720530 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.721191 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.721677 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.724735 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghhwq\" (UniqueName: \"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.811173 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913301 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lhjm\" (UniqueName: \"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913347 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913386 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913515 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913644 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913796 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.914520 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.922552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.932088 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts" (OuterVolumeSpecName: "scripts") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.932201 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm" (OuterVolumeSpecName: "kube-api-access-6lhjm") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "kube-api-access-6lhjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.935112 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.953838 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data" (OuterVolumeSpecName: "config-data") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016072 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016101 4883 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016123 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lhjm\" (UniqueName: \"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016137 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016146 4883 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016155 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.089789 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" path="/var/lib/kubelet/pods/87232af6-dc87-4f68-8b1f-850fd98219a8/volumes" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.504797 4883 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.504945 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.505887 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x2hf5" event={"ID":"dc0b1d9d-7834-473a-a487-6f540c606706","Type":"ContainerDied","Data":"bd6a779566a1daa68a4f86119c6780e52d4493f0681838af48cbb4dbd90f52cc"} Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.505966 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd6a779566a1daa68a4f86119c6780e52d4493f0681838af48cbb4dbd90f52cc" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.539080 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.550041 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.552798 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.677143 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755612 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghhwq\" (UniqueName: \"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755758 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755836 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755867 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755908 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.756013 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.756098 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 
09:23:24.765822 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.766509 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.767768 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.768652 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq" (OuterVolumeSpecName: "kube-api-access-ghhwq") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "kube-api-access-ghhwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.778602 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts" (OuterVolumeSpecName: "scripts") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.787946 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data" (OuterVolumeSpecName: "config-data") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.789093 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.795103 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:24 crc kubenswrapper[4883]: E0310 09:23:24.795436 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" containerName="cinder-db-sync" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.795453 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" containerName="cinder-db-sync" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.795649 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" containerName="cinder-db-sync" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.796464 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.805573 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.805655 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-prwrq" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.805774 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.805889 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.812546 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859041 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghhwq\" (UniqueName: 
\"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859069 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859080 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859095 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859104 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859112 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859120 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.878020 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"] Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.879755 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.902004 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"] Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960665 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960726 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960814 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960837 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960858 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960962 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.062498 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.062785 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.062905 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063015 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063101 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063298 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063377 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063454 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") pod \"cinder-scheduler-0\" 
(UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063546 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063651 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063762 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.065192 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.070041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc 
kubenswrapper[4883]: I0310 09:23:25.073934 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.077623 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.091435 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.100412 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.111690 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.113313 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.117289 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.124682 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.164185 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.165788 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.165840 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.165887 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.165939 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.166015 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.166094 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.167817 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.171075 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.185553 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " 
pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.194678 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.195641 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.204725 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.222663 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.268618 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.268933 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22fbf\" (UniqueName: 
\"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.269043 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.269181 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.269300 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.269497 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.270408 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.373729 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374032 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22fbf\" (UniqueName: \"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374052 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374096 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374301 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374383 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374484 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374573 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374718 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.380074 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.384401 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.384830 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.388843 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.391171 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22fbf\" (UniqueName: \"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.500666 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.512394 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.512539 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon-log" containerID="cri-o://5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" gracePeriod=30 Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.512825 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" containerID="cri-o://ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" gracePeriod=30 Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.522001 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.612201 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.612509 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bc48b486f-2j899" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-api" containerID="cri-o://3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49" gracePeriod=30 Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.612959 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bc48b486f-2j899" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" containerID="cri-o://6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a" gracePeriod=30 Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.655266 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5bc48b486f-2j899" 
podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": EOF" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.674027 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.687928 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.707568 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b5fb6fc5c-pj985"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.709385 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.720239 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b5fb6fc5c-pj985"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.730643 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.733042 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.736361 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.737573 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.751075 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.762500 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885809 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885842 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-ovndb-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885880 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885900 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885935 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ss9n\" (UniqueName: \"kubernetes.io/projected/82fb8a17-1c35-415a-8a5d-478730286eb1-kube-api-access-9ss9n\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886315 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-internal-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886425 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-httpd-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886545 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-public-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886619 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886720 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886756 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886803 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-combined-ca-bundle\") pod 
\"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.988979 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989029 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989084 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ss9n\" (UniqueName: \"kubernetes.io/projected/82fb8a17-1c35-415a-8a5d-478730286eb1-kube-api-access-9ss9n\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989212 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-internal-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989274 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-httpd-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc 
kubenswrapper[4883]: I0310 09:23:25.989352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-public-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989449 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989486 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989559 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-combined-ca-bundle\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989641 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989785 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.990466 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.990569 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-ovndb-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.990575 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.997060 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.997129 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.997719 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.998067 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-internal-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.998130 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-httpd-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.998230 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.998440 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-combined-ca-bundle\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.000814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-ovndb-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.002836 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.004667 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.006240 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ss9n\" (UniqueName: \"kubernetes.io/projected/82fb8a17-1c35-415a-8a5d-478730286eb1-kube-api-access-9ss9n\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.008214 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-public-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.033195 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.095757 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.106375 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c96038a-8a42-4863-a89f-5076c24da12e" path="/var/lib/kubelet/pods/6c96038a-8a42-4863-a89f-5076c24da12e/volumes"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.107252 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"]
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.182395 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.522849 4883 generic.go:334] "Generic (PLEG): container finished" podID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerID="6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a" exitCode=0
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.522907 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerDied","Data":"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"}
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.526829 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerStarted","Data":"987bc0cc38389c3b6190190ae5c16e3637e3022a5ee37c4c3bc24573be51664c"}
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.539178 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerStarted","Data":"3b75442ea4aee0774586b8d43c5c36b051132ecdb5e3320c19060d614e90bf9c"}
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.547680 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.553850 4883 generic.go:334] "Generic (PLEG): container finished" podID="41ac116f-f773-4b3f-a508-bc304668da18" containerID="7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc" exitCode=0
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.553908 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerDied","Data":"7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc"}
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.556338 4883 generic.go:334] "Generic (PLEG): container finished" podID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerID="8dc17cfa98b4f413422ff6ec7b4debd0ca6ed29db8f51bb73e604fc0c8aedd72" exitCode=0
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.556359 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerDied","Data":"8dc17cfa98b4f413422ff6ec7b4debd0ca6ed29db8f51bb73e604fc0c8aedd72"}
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.556376 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerStarted","Data":"f3dfd9c8abe53e2f4e70fd004e6457ef025d9a2c819617d0dfac05e54db79843"}
Mar 10 09:23:26 crc kubenswrapper[4883]: W0310 09:23:26.565395 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa5bf577_e25f_4df2_b088_d1b667ea1d0e.slice/crio-b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca WatchSource:0}: Error finding container b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca: Status 404 returned error can't find the container with id b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.690177 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b5fb6fc5c-pj985"]
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.759148 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b5b6f75db-kgz55"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919273 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") "
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919442 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") "
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919491 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") "
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919585 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") "
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919647 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") "
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919744 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") "
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919853 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") "
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.920166 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs" (OuterVolumeSpecName: "logs") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.920497 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.928252 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.929214 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m" (OuterVolumeSpecName: "kube-api-access-4gb9m") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "kube-api-access-4gb9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.943204 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5bc48b486f-2j899" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": dial tcp 10.217.0.156:9696: connect: connection refused"
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.952872 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.959885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.994081 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.004991 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.022986 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data" (OuterVolumeSpecName: "config-data") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023073 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") "
Mar 10 09:23:27 crc kubenswrapper[4883]: W0310 09:23:27.023305 4883 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/41ac116f-f773-4b3f-a508-bc304668da18/volumes/kubernetes.io~secret/config-data
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023320 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data" (OuterVolumeSpecName: "config-data") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023789 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023814 4883 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023825 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023837 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023848 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023861 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.579737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerStarted","Data":"cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.580217 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.584277 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b5fb6fc5c-pj985" event={"ID":"82fb8a17-1c35-415a-8a5d-478730286eb1","Type":"ContainerStarted","Data":"15fb2cc42b063a1215930cdffeb827d10df153ae99d5d786342702e21c8dc077"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.584306 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b5fb6fc5c-pj985" event={"ID":"82fb8a17-1c35-415a-8a5d-478730286eb1","Type":"ContainerStarted","Data":"048ce7d69969a131d91672c025bd83c18be75fbcfc3649fc4a94970ff74e3a77"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.584319 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b5fb6fc5c-pj985" event={"ID":"82fb8a17-1c35-415a-8a5d-478730286eb1","Type":"ContainerStarted","Data":"bd937ca1a6b53d82a1756c79df23dd1772ef704b29ba04e5d1cc890f7ab8778d"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.585552 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7b5fb6fc5c-pj985"
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.599759 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" podStartSLOduration=3.59974666 podStartE2EDuration="3.59974666s" podCreationTimestamp="2026-03-10 09:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:27.59827764 +0000 UTC m=+1193.853175528" watchObservedRunningTime="2026-03-10 09:23:27.59974666 +0000 UTC m=+1193.854644549"
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609589 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerStarted","Data":"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609643 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerStarted","Data":"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609644 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api-log" containerID="cri-o://0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3" gracePeriod=30
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609702 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609732 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api" containerID="cri-o://4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5" gracePeriod=30
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.621349 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerStarted","Data":"29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.622581 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7b5fb6fc5c-pj985" podStartSLOduration=2.622562604 podStartE2EDuration="2.622562604s" podCreationTimestamp="2026-03-10 09:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:27.615058012 +0000 UTC m=+1193.869955901" watchObservedRunningTime="2026-03-10 09:23:27.622562604 +0000 UTC m=+1193.877460493"
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.625210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.625253 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.629525 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerDied","Data":"867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db"}
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.629640 4883 scope.go:117] "RemoveContainer" containerID="7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc"
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.629813 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b5b6f75db-kgz55"
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.644256 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.644234109 podStartE2EDuration="2.644234109s" podCreationTimestamp="2026-03-10 09:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:27.634715811 +0000 UTC m=+1193.889613700" watchObservedRunningTime="2026-03-10 09:23:27.644234109 +0000 UTC m=+1193.899131999"
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.663325 4883 scope.go:117] "RemoveContainer" containerID="f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a"
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.668644 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"]
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.674072 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"]
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.095053 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ac116f-f773-4b3f-a508-bc304668da18" path="/var/lib/kubelet/pods/41ac116f-f773-4b3f-a508-bc304668da18/volumes"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.151980 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bc48b486f-2j899"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.152780 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") "
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.152844 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") "
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.152914 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") "
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.152931 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") "
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.224554 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.227779 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.244703 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.247552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.255293 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") "
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.255458 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj9nk\" (UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") "
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.255711 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") "
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.256934 4883 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.257045 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.257104 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.257154 4883 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.258675 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk" (OuterVolumeSpecName: "kube-api-access-xj9nk") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "kube-api-access-xj9nk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.258821 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.290006 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config" (OuterVolumeSpecName: "config") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.358889 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj9nk\" (UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.358914 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.358924 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.642631 4883 generic.go:334] "Generic (PLEG): container finished" podID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerID="3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49" exitCode=0
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.642696 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bc48b486f-2j899"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.642733 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerDied","Data":"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.643611 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerDied","Data":"3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.643638 4883 scope.go:117] "RemoveContainer" containerID="6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.648230 4883 generic.go:334] "Generic (PLEG): container finished" podID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerID="0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3" exitCode=143
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.648315 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerDied","Data":"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.650810 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerStarted","Data":"95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.666197 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.672604 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.586911066 podStartE2EDuration="4.672578665s" podCreationTimestamp="2026-03-10 09:23:24 +0000 UTC" firstStartedPulling="2026-03-10 09:23:25.734236781 +0000 UTC m=+1191.989134671" lastFinishedPulling="2026-03-10 09:23:26.819904381 +0000 UTC m=+1193.074802270" observedRunningTime="2026-03-10 09:23:28.671185999 +0000 UTC m=+1194.926083888" watchObservedRunningTime="2026-03-10 09:23:28.672578665 +0000 UTC m=+1194.927476554"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.678223 4883 scope.go:117] "RemoveContainer" containerID="3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.698556 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"]
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.720495 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"]
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.723531 4883 scope.go:117] "RemoveContainer" containerID="6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"
Mar 10 09:23:28 crc kubenswrapper[4883]: E0310 09:23:28.724293 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a\": container with ID starting with 6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a not found: ID does not exist" containerID="6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.724326 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"} err="failed to get container status \"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a\": rpc error: code = NotFound desc = could not find container \"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a\": container with ID starting with 6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a not found: ID does not exist"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.724349 4883 scope.go:117] "RemoveContainer" containerID="3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"
Mar 10 09:23:28 crc kubenswrapper[4883]: E0310 09:23:28.725846 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49\": container with ID starting with 3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49 not found: ID does not exist" containerID="3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.725919 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"} err="failed to get container status \"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49\": rpc error: code = NotFound desc = could not find container \"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49\": container with ID starting with 3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49 not found: ID does not exist"
Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.708212 4883 generic.go:334] "Generic (PLEG): container finished" podID="a4909549-f2c4-45b0-a8f8-521302991297"
containerID="ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" exitCode=0 Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.708299 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerDied","Data":"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317"} Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.785123 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.787326 4883 generic.go:334] "Generic (PLEG): container finished" podID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerID="0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424" exitCode=137 Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.787355 4883 generic.go:334] "Generic (PLEG): container finished" podID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerID="c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9" exitCode=137 Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.787381 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerDied","Data":"0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424"} Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.787464 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerDied","Data":"c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9"} Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.057435 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.092585 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" path="/var/lib/kubelet/pods/77895d16-8ad3-4edb-ae91-d807afd499b3/volumes" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.102827 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") pod \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.102928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") pod \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.102982 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") pod \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.103000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxrwr\" (UniqueName: \"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") pod \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.103026 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") pod 
\"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.104077 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs" (OuterVolumeSpecName: "logs") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.108887 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.114418 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr" (OuterVolumeSpecName: "kube-api-access-fxrwr") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "kube-api-access-fxrwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.125339 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data" (OuterVolumeSpecName: "config-data") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.126330 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts" (OuterVolumeSpecName: "scripts") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.165143 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204543 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204643 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204701 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204749 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204814 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxrwr\" (UniqueName: \"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:30 crc 
kubenswrapper[4883]: I0310 09:23:30.799246 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.800142 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerDied","Data":"b811a98b9eb641299a8060183de1afd8f605b6eb8c5e07f91e568070217a7cad"} Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.800295 4883 scope.go:117] "RemoveContainer" containerID="0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424" Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.932189 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.938523 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:23:31 crc kubenswrapper[4883]: I0310 09:23:31.032777 4883 scope.go:117] "RemoveContainer" containerID="c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9" Mar 10 09:23:31 crc kubenswrapper[4883]: I0310 09:23:31.262643 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 09:23:31 crc kubenswrapper[4883]: I0310 09:23:31.810569 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} Mar 10 09:23:31 crc kubenswrapper[4883]: I0310 09:23:31.836052 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.654828327 podStartE2EDuration="6.836031279s" podCreationTimestamp="2026-03-10 09:23:25 +0000 UTC" firstStartedPulling="2026-03-10 09:23:26.586767769 +0000 UTC m=+1192.841665657" lastFinishedPulling="2026-03-10 09:23:30.76797072 +0000 UTC m=+1197.022868609" observedRunningTime="2026-03-10 09:23:31.832289743 +0000 UTC m=+1198.087187632" watchObservedRunningTime="2026-03-10 09:23:31.836031279 +0000 UTC m=+1198.090929167" Mar 10 09:23:32 crc kubenswrapper[4883]: I0310 09:23:32.091200 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" path="/var/lib/kubelet/pods/dedd724b-83f5-408a-b4d5-08adb5d71cc0/volumes" Mar 10 09:23:32 crc kubenswrapper[4883]: I0310 09:23:32.820765 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.407203 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.463609 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.503216 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.580156 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"] Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.580569 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="dnsmasq-dns" containerID="cri-o://25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2" gracePeriod=10 Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.862378 4883 generic.go:334] 
"Generic (PLEG): container finished" podID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerID="25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2" exitCode=0 Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.862668 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerDied","Data":"25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2"} Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.862757 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="cinder-scheduler" containerID="cri-o://29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7" gracePeriod=30 Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.862983 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="probe" containerID="cri-o://95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136" gracePeriod=30 Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.089073 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217068 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217191 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217247 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217270 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217290 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217366 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.223004 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g" (OuterVolumeSpecName: "kube-api-access-pmx5g") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "kube-api-access-pmx5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.252746 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.255288 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.256033 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.261869 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.265010 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config" (OuterVolumeSpecName: "config") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321259 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321288 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321299 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321310 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321321 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321330 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.873150 4883 generic.go:334] "Generic (PLEG): container finished" podID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerID="95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136" exitCode=0 Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.873274 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerDied","Data":"95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136"} Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.875661 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerDied","Data":"f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa"} Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.875721 4883 scope.go:117] "RemoveContainer" containerID="25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.875904 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.900009 4883 scope.go:117] "RemoveContainer" containerID="611c5c4793d5a469fa00a11e611df52aec3fea84115f5f16327469e87284b34b" Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.919170 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"] Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.926632 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"] Mar 10 09:23:37 crc kubenswrapper[4883]: I0310 09:23:37.183930 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 09:23:37 crc kubenswrapper[4883]: I0310 09:23:37.886719 4883 generic.go:334] "Generic (PLEG): container finished" podID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerID="29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7" exitCode=0 Mar 10 09:23:37 crc kubenswrapper[4883]: I0310 09:23:37.886998 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerDied","Data":"29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7"} Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.087843 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.092079 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" path="/var/lib/kubelet/pods/ce11c091-db9b-47fb-8427-dcaa2585a4c7/volumes" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.155833 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.155871 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.155907 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.155934 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.156010 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") pod 
\"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.156030 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.156062 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.156435 4883 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.161375 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts" (OuterVolumeSpecName: "scripts") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.161739 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf" (OuterVolumeSpecName: "kube-api-access-d4sgf") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "kube-api-access-d4sgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.161895 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.192536 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.231874 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data" (OuterVolumeSpecName: "config-data") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259394 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259433 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259445 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259454 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259465 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.896723 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerDied","Data":"3b75442ea4aee0774586b8d43c5c36b051132ecdb5e3320c19060d614e90bf9c"} Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.896786 4883 scope.go:117] "RemoveContainer" containerID="95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.896784 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.913728 4883 scope.go:117] "RemoveContainer" containerID="29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.922588 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.928006 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940506 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940877 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940896 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940911 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="dnsmasq-dns" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940917 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="dnsmasq-dns" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940930 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon-log" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940936 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon-log" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940955 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="cinder-scheduler" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940961 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="cinder-scheduler" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940975 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-api" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940980 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-api" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940988 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940993 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.941001 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941008 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.941017 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="probe" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941023 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="probe" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.941034 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" 
containerName="init" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941039 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="init" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.941046 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api-log" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941051 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api-log" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941202 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941216 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="dnsmasq-dns" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941223 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="cinder-scheduler" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941233 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon-log" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941242 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941250 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="probe" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941256 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon" Mar 10 
09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941271 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-api" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941281 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api-log" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.942182 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.943712 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.953372 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.079861 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.079899 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.079980 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.080044 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btv4b\" (UniqueName: \"kubernetes.io/projected/a7bae0a1-9bb8-47ba-a161-764cd7406992-kube-api-access-btv4b\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.080160 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.080192 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7bae0a1-9bb8-47ba-a161-764cd7406992-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.181863 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.181932 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btv4b\" (UniqueName: \"kubernetes.io/projected/a7bae0a1-9bb8-47ba-a161-764cd7406992-kube-api-access-btv4b\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " 
pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182046 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182078 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7bae0a1-9bb8-47ba-a161-764cd7406992-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182358 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7bae0a1-9bb8-47ba-a161-764cd7406992-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182866 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.186902 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.187210 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.188058 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.188179 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.197911 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btv4b\" (UniqueName: \"kubernetes.io/projected/a7bae0a1-9bb8-47ba-a161-764cd7406992-kube-api-access-btv4b\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.258277 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.657849 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:39 crc kubenswrapper[4883]: W0310 09:23:39.661611 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7bae0a1_9bb8_47ba_a161_764cd7406992.slice/crio-658adbd61df5c334c9198f35c7aa39ec91f5a9e8aca298d8495188a80fe42f9a WatchSource:0}: Error finding container 658adbd61df5c334c9198f35c7aa39ec91f5a9e8aca298d8495188a80fe42f9a: Status 404 returned error can't find the container with id 658adbd61df5c334c9198f35c7aa39ec91f5a9e8aca298d8495188a80fe42f9a Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.909758 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7bae0a1-9bb8-47ba-a161-764cd7406992","Type":"ContainerStarted","Data":"658adbd61df5c334c9198f35c7aa39ec91f5a9e8aca298d8495188a80fe42f9a"} Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.097321 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" path="/var/lib/kubelet/pods/0a4664a7-ad8d-44ab-8f7f-d621e6b01899/volumes" Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.864392 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.923237 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7bae0a1-9bb8-47ba-a161-764cd7406992","Type":"ContainerStarted","Data":"05192ac9b45aeed45a69a1553b5262897db22a4255d969c7aec06c33ce478cce"} Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.923284 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a7bae0a1-9bb8-47ba-a161-764cd7406992","Type":"ContainerStarted","Data":"66ce932d5a4da7e11095bb03edb1c911dda0fe72e21487c190608ef140a2dae6"} Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.945031 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.945009183 podStartE2EDuration="2.945009183s" podCreationTimestamp="2026-03-10 09:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:40.941582752 +0000 UTC m=+1207.196480642" watchObservedRunningTime="2026-03-10 09:23:40.945009183 +0000 UTC m=+1207.199907073" Mar 10 09:23:41 crc kubenswrapper[4883]: I0310 09:23:41.261786 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.259194 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.499293 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.501807 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.505879 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config-secret\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.506366 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.506632 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.506770 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmqrd\" (UniqueName: \"kubernetes.io/projected/166b0c95-d44f-41e4-b27a-01e549dfb9d2-kube-api-access-qmqrd\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.509722 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.510797 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 
09:23:44.510894 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.510916 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-642mt" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.608350 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.608462 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.608515 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqrd\" (UniqueName: \"kubernetes.io/projected/166b0c95-d44f-41e4-b27a-01e549dfb9d2-kube-api-access-qmqrd\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.608583 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config-secret\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.612100 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.615748 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config-secret\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.615938 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.624864 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmqrd\" (UniqueName: \"kubernetes.io/projected/166b0c95-d44f-41e4-b27a-01e549dfb9d2-kube-api-access-qmqrd\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.822081 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 09:23:45 crc kubenswrapper[4883]: I0310 09:23:45.243068 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 09:23:45 crc kubenswrapper[4883]: I0310 09:23:45.966177 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"166b0c95-d44f-41e4-b27a-01e549dfb9d2","Type":"ContainerStarted","Data":"e27aa793a515e22a3bd9f4f5ac0b1c0c27191df3f76a0a58fdc118ca2a860551"} Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.449076 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.449401 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.449456 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.450270 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.450324 4883 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3" gracePeriod=600 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.001996 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3" exitCode=0 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.002163 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3"} Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.002430 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225"} Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.002496 4883 scope.go:117] "RemoveContainer" containerID="7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.102135 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6656f7cc-nv5pp"] Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.104278 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6656f7cc-nv5pp"] Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.104380 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.106924 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.107406 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.112715 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206026 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-config-data\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206308 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-etc-swift\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206486 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-run-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206684 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-internal-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206929 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-public-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206963 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-log-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.207035 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-combined-ca-bundle\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.207136 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55nn\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-kube-api-access-j55nn\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.230182 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:48 
crc kubenswrapper[4883]: I0310 09:23:48.230534 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-central-agent" containerID="cri-o://c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" gracePeriod=30 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.231447 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd" containerID="cri-o://c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" gracePeriod=30 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.231552 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="sg-core" containerID="cri-o://77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" gracePeriod=30 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.231629 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-notification-agent" containerID="cri-o://ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" gracePeriod=30 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.249977 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.173:3000/\": EOF" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.308960 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-internal-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: 
\"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309108 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-public-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309133 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-log-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-combined-ca-bundle\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309224 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j55nn\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-kube-api-access-j55nn\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309254 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-config-data\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " 
pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309270 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-etc-swift\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309378 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-run-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.310196 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-run-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.314930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-log-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.321205 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-internal-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.322884 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-public-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.324560 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-combined-ca-bundle\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.327966 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-etc-swift\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.328720 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55nn\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-kube-api-access-j55nn\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.339686 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-config-data\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.432271 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.926578 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.941443 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6656f7cc-nv5pp"] Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.014294 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6656f7cc-nv5pp" event={"ID":"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b","Type":"ContainerStarted","Data":"613e9b6262e6592360fde16de67b1c0eafb2ca168888dae279806bfa2fb4a2a8"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017808 4883 generic.go:334] "Generic (PLEG): container finished" podID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" exitCode=0 Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017833 4883 generic.go:334] "Generic (PLEG): container finished" podID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" exitCode=2 Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017842 4883 generic.go:334] "Generic (PLEG): container finished" podID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" exitCode=0 Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017849 4883 generic.go:334] "Generic (PLEG): container finished" podID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" exitCode=0 Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017886 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017907 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017917 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017926 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017941 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017956 4883 scope.go:117] "RemoveContainer" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.018695 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034555 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034629 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034658 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034749 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034785 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034810 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034830 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.038690 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.038914 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.040572 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts" (OuterVolumeSpecName: "scripts") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.042086 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82" (OuterVolumeSpecName: "kube-api-access-4xj82") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "kube-api-access-4xj82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.045145 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.075578 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.075587 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.117377 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137427 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137453 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137464 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137503 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137513 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137522 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.155299 4883 scope.go:117] "RemoveContainer" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.218314 4883 scope.go:117] "RemoveContainer" 
containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220054 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data" (OuterVolumeSpecName: "config-data") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.220227 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220251 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} err="failed to get container status \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220278 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.220575 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not exist" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220718 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} err="failed to get container status \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": rpc error: code = NotFound desc = could not find container \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220737 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.221006 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221023 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} err="failed to get container status \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID 
starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221048 4883 scope.go:117] "RemoveContainer" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.221436 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221452 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} err="failed to get container status \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221526 4883 scope.go:117] "RemoveContainer" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221773 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} err="failed to get container status \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": 
container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221789 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222043 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} err="failed to get container status \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": rpc error: code = NotFound desc = could not find container \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222065 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222314 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} err="failed to get container status \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222337 4883 scope.go:117] "RemoveContainer" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222600 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} err="failed to get container status \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222622 4883 scope.go:117] "RemoveContainer" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223016 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} err="failed to get container status \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223031 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223431 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} err="failed to get container status \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": rpc error: code = NotFound desc = could not find container \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not 
exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223445 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223991 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} err="failed to get container status \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224006 4883 scope.go:117] "RemoveContainer" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224438 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} err="failed to get container status \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224451 4883 scope.go:117] "RemoveContainer" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224858 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} err="failed to get container status 
\"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224871 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.225213 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} err="failed to get container status \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": rpc error: code = NotFound desc = could not find container \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not exist"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.225226 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.225714 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} err="failed to get container status \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.225732 4883 scope.go:117] "RemoveContainer" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.226402 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} err="failed to get container status \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.238665 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.355674 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.363568 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384281 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.384719 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="sg-core"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384741 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="sg-core"
Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.384755 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384761 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd"
Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.384779 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-notification-agent"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384785 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-notification-agent"
Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.384804 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-central-agent"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384809 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-central-agent"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384987 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="sg-core"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.385012 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-notification-agent"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.385021 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.385036 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-central-agent"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.387040 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.389373 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.395379 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.404091 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441507 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441591 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441762 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441844 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441869 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441907 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441929 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.541448 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543447 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543560 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543628 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543703 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543726 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543752 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543774 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.544139 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.544199 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.548259 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.547524 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.552096 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.552158 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.564075 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0"
Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.712291 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.045231 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6656f7cc-nv5pp" event={"ID":"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b","Type":"ContainerStarted","Data":"0ea21f69372c054943b2a3fa43db7fbcdd35a3e836454d8db14baa2b8075e60b"}
Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.045298 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6656f7cc-nv5pp" event={"ID":"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b","Type":"ContainerStarted","Data":"de0c2d43864189c52c0b910f86838a18fa45d493ba35f8abee881a9ffb22760b"}
Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.046818 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6656f7cc-nv5pp"
Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.046855 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6656f7cc-nv5pp"
Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.070409 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6656f7cc-nv5pp" podStartSLOduration=2.070388996 podStartE2EDuration="2.070388996s" podCreationTimestamp="2026-03-10 09:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:50.068272867 +0000 UTC m=+1216.323170746" watchObservedRunningTime="2026-03-10 09:23:50.070388996 +0000 UTC m=+1216.325286886"
Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.091180 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" path="/var/lib/kubelet/pods/aa5bf577-e25f-4df2-b088-d1b667ea1d0e/volumes"
Mar 10 09:23:51 crc kubenswrapper[4883]: I0310 09:23:51.040461 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5946656968-5mzlm"
Mar 10 09:23:51 crc kubenswrapper[4883]: I0310 09:23:51.042219 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5946656968-5mzlm"
Mar 10 09:23:51 crc kubenswrapper[4883]: I0310 09:23:51.261662 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused"
Mar 10 09:23:51 crc kubenswrapper[4883]: I0310 09:23:51.262035 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fcc9bbb48-lf4jb"
Mar 10 09:23:54 crc kubenswrapper[4883]: I0310 09:23:54.961248 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.027193 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.027673 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-log" containerID="cri-o://9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630" gracePeriod=30
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.027828 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-httpd" containerID="cri-o://a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82" gracePeriod=30
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.110391 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"166b0c95-d44f-41e4-b27a-01e549dfb9d2","Type":"ContainerStarted","Data":"a5721b4d9c7d6d5ef64c930764160a5476875b9aa466203bac010d9ce58c29b4"}
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.111673 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"8662768bfd83f0fff77bedd9babc38ac435600a470cdeb93306da3bcece7d468"}
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.132850 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.812426871 podStartE2EDuration="11.132838023s" podCreationTimestamp="2026-03-10 09:23:44 +0000 UTC" firstStartedPulling="2026-03-10 09:23:45.254642995 +0000 UTC m=+1211.509540884" lastFinishedPulling="2026-03-10 09:23:54.575054156 +0000 UTC m=+1220.829952036" observedRunningTime="2026-03-10 09:23:55.123691525 +0000 UTC m=+1221.378589404" watchObservedRunningTime="2026-03-10 09:23:55.132838023 +0000 UTC m=+1221.387735913"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.497747 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4vxd6"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.498835 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4vxd6"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.510729 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4vxd6"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.601254 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.602721 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.605937 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.606000 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.613208 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.616180 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f74b-account-create-update-lsxls"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.618102 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.624401 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.637325 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708306 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708353 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708381 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708418 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmdwp\" (UniqueName: \"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708443 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708529 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.709391 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.709445 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zr486"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.710659 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zr486"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.717630 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zr486"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.729087 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811264 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811355 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmdwp\" (UniqueName: \"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811412 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811452 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811613 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.813103 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4vxd6"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.814285 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.815052 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.830388 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.831719 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.834051 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.844643 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.845027 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmdwp\" (UniqueName: \"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.853124 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.880258 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.880859 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-log" containerID="cri-o://feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" gracePeriod=30
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.881387 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-httpd" containerID="cri-o://93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" gracePeriod=30
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.913413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.913496 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.913576 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd5d5\" (UniqueName: \"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.913701 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.914593 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.945963 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.959701 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fcc9bbb48-lf4jb"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.964021 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.992776 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f74b-account-create-update-lsxls"
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015298 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") "
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015451 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") "
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015496 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") "
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015551 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") "
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015666 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") "
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015739 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") "
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015797 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") "
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.016283 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj"
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.016423 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd5d5\" (UniqueName: \"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj"
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.022961 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj"
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.025316 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs" (OuterVolumeSpecName: "logs") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.027560 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm" (OuterVolumeSpecName: "kube-api-access-scthm") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "kube-api-access-scthm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.038229 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.074901 4883 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.111057 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd5d5\" (UniqueName: \"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.168122 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.168155 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.168167 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.201382 4883 generic.go:334] "Generic (PLEG): container finished" podID="f9967357-b98f-4e31-9934-f99669b31024" containerID="feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" exitCode=143 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.221151 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.221153 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerDied","Data":"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.221279 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.230691 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:23:56 crc kubenswrapper[4883]: E0310 09:23:56.234871 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon-log" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.234891 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon-log" Mar 10 09:23:56 crc kubenswrapper[4883]: E0310 09:23:56.234920 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.234927 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.235250 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.235280 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon-log" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.236341 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.238323 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.240837 4883 generic.go:334] "Generic (PLEG): container finished" podID="a4909549-f2c4-45b0-a8f8-521302991297" containerID="5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" exitCode=137 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.240990 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.241038 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerDied","Data":"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.241066 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerDied","Data":"023eb4b942e99478ddc5c2302dbb0ec5737ecdcfe04fb54667164182410590d3"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.241088 4883 scope.go:117] "RemoveContainer" containerID="ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.241960 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.245974 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.273183 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data" (OuterVolumeSpecName: "config-data") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.273309 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.278624 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.278653 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.278665 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.280308 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerID="9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630" exitCode=143 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.280458 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerDied","Data":"9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.292694 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts" (OuterVolumeSpecName: "scripts") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.299203 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.338730 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.338964 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f6f8846bd-rdwfd" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-api" containerID="cri-o://02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" gracePeriod=30 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.339382 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f6f8846bd-rdwfd" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-httpd" containerID="cri-o://5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" gracePeriod=30 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.380757 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.381177 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.381792 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.483402 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.483760 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.484999 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.508008 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.556929 4883 scope.go:117] "RemoveContainer" containerID="5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.582856 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.591052 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.604404 4883 scope.go:117] "RemoveContainer" containerID="ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" Mar 10 09:23:56 crc kubenswrapper[4883]: E0310 09:23:56.609580 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317\": container with ID starting with ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317 not found: ID does not exist" containerID="ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.609650 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317"} err="failed to get container status \"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317\": rpc error: code = NotFound desc = could not find container \"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317\": container with ID starting with ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317 not found: ID does not exist" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.609697 4883 scope.go:117] "RemoveContainer" containerID="5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" Mar 10 09:23:56 crc kubenswrapper[4883]: E0310 09:23:56.610865 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff\": container with ID starting with 5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff not found: ID does not exist" containerID="5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.610898 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff"} err="failed to get container status \"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff\": rpc error: code = NotFound desc = could not find container \"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff\": container with ID starting with 5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff not found: ID does not exist" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.675112 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.748984 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.763371 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.919690 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4vxd6"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.936788 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zr486"] Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.056504 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"] Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.185345 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.310677 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" event={"ID":"e9dd286b-6aa5-4525-a645-8e4ec79af348","Type":"ContainerStarted","Data":"328d6abd25cfa7a71e9b0831cc5309de56da4eea4f989a50a43c9c33525a5879"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.313603 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f74b-account-create-update-lsxls" event={"ID":"d355ddcd-9120-4436-84c4-928027e6ee33","Type":"ContainerStarted","Data":"ca65d438bf33bf28ae2760bd85500809892b5530a19b9249afc6d3b1e6b1e723"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.313704 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f74b-account-create-update-lsxls" 
event={"ID":"d355ddcd-9120-4436-84c4-928027e6ee33","Type":"ContainerStarted","Data":"6d8e8a7ffeaf42a416ff3d5dced177bcd0777c66e72676ad42d73d1bfa28b123"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.316449 4883 generic.go:334] "Generic (PLEG): container finished" podID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" containerID="edddf942ff54cf02d31c8d37d1a93a850752455b76c3f9b8d5acabfd5e985820" exitCode=0 Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.316516 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-l9ldx" event={"ID":"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8","Type":"ContainerDied","Data":"edddf942ff54cf02d31c8d37d1a93a850752455b76c3f9b8d5acabfd5e985820"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.316573 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-l9ldx" event={"ID":"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8","Type":"ContainerStarted","Data":"458dceb3e1cca2ec00511601cd3c6de401cd9e82c4e2ccda5ef9b21bd6f813bb"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.322596 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4vxd6" event={"ID":"fdbd0859-6f93-4118-9e5b-2170ec3d43ad","Type":"ContainerStarted","Data":"d23bd9bd934e3d99aa1bfc6ccd981d2c32f179368b140bc3665f8538d2c19637"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.322679 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4vxd6" event={"ID":"fdbd0859-6f93-4118-9e5b-2170ec3d43ad","Type":"ContainerStarted","Data":"8ad2396f2e7a9d3bbf9da675e807414641fe972e08a0b92dada5b43eb2016b98"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.328102 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" event={"ID":"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9","Type":"ContainerStarted","Data":"e1764b4c527a37975cc4e8ef8e724e3a447bf2698dc21042ec02c05dc2aa830c"} Mar 10 09:23:57 
crc kubenswrapper[4883]: I0310 09:23:57.328147 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" event={"ID":"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9","Type":"ContainerStarted","Data":"e69d19127eb3d9ce4f3e34fa37eba123f18c99b6eaeaed465228d938b7758035"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.331361 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-f74b-account-create-update-lsxls" podStartSLOduration=2.33134565 podStartE2EDuration="2.33134565s" podCreationTimestamp="2026-03-10 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:57.324623205 +0000 UTC m=+1223.579521094" watchObservedRunningTime="2026-03-10 09:23:57.33134565 +0000 UTC m=+1223.586243540" Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.331984 4883 generic.go:334] "Generic (PLEG): container finished" podID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerID="5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" exitCode=0 Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.332044 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerDied","Data":"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.336089 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.338200 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zr486" 
event={"ID":"e39def71-60ef-4b2a-823b-1c5e89e02647","Type":"ContainerStarted","Data":"d00a8e52c275926dd98e27f733cab6f0434bb178955532902b15a5e7ab5d08f8"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.338251 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zr486" event={"ID":"e39def71-60ef-4b2a-823b-1c5e89e02647","Type":"ContainerStarted","Data":"5149fd889fe5536a323a71b17dfaf7c9c619e260b3b58f02d0982a4f00153649"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.357197 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-4vxd6" podStartSLOduration=2.357177011 podStartE2EDuration="2.357177011s" podCreationTimestamp="2026-03-10 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:57.354162035 +0000 UTC m=+1223.609059925" watchObservedRunningTime="2026-03-10 09:23:57.357177011 +0000 UTC m=+1223.612074900" Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.371772 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" podStartSLOduration=2.371753097 podStartE2EDuration="2.371753097s" podCreationTimestamp="2026-03-10 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:57.364066833 +0000 UTC m=+1223.618964722" watchObservedRunningTime="2026-03-10 09:23:57.371753097 +0000 UTC m=+1223.626650986" Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.415436 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-zr486" podStartSLOduration=2.415408528 podStartE2EDuration="2.415408528s" podCreationTimestamp="2026-03-10 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:57.399463931 +0000 UTC m=+1223.654361820" watchObservedRunningTime="2026-03-10 09:23:57.415408528 +0000 UTC m=+1223.670306417" Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.739871 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.101116 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4909549-f2c4-45b0-a8f8-521302991297" path="/var/lib/kubelet/pods/a4909549-f2c4-45b0-a8f8-521302991297/volumes" Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.167265 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9dd286b_6aa5_4525_a645_8e4ec79af348.slice/crio-911a3a55df750ade4baf157ba45ca477442b34e3cdb85095ba2fd8c7ca80b8fe.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.272351 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354036 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354126 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354153 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354226 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22fbf\" (UniqueName: \"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354280 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354374 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354512 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354660 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.355007 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs" (OuterVolumeSpecName: "logs") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.356684 4883 generic.go:334] "Generic (PLEG): container finished" podID="e39def71-60ef-4b2a-823b-1c5e89e02647" containerID="d00a8e52c275926dd98e27f733cab6f0434bb178955532902b15a5e7ab5d08f8" exitCode=0
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.356774 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zr486" event={"ID":"e39def71-60ef-4b2a-823b-1c5e89e02647","Type":"ContainerDied","Data":"d00a8e52c275926dd98e27f733cab6f0434bb178955532902b15a5e7ab5d08f8"}
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.357546 4883 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.357561 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.359885 4883 generic.go:334] "Generic (PLEG): container finished" podID="e9dd286b-6aa5-4525-a645-8e4ec79af348" containerID="911a3a55df750ade4baf157ba45ca477442b34e3cdb85095ba2fd8c7ca80b8fe" exitCode=0
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.359932 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" event={"ID":"e9dd286b-6aa5-4525-a645-8e4ec79af348","Type":"ContainerDied","Data":"911a3a55df750ade4baf157ba45ca477442b34e3cdb85095ba2fd8c7ca80b8fe"}
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.360188 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts" (OuterVolumeSpecName: "scripts") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.360399 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.361439 4883 generic.go:334] "Generic (PLEG): container finished" podID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" containerID="e1764b4c527a37975cc4e8ef8e724e3a447bf2698dc21042ec02c05dc2aa830c" exitCode=0
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.361506 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" event={"ID":"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9","Type":"ContainerDied","Data":"e1764b4c527a37975cc4e8ef8e724e3a447bf2698dc21042ec02c05dc2aa830c"}
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.364620 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerID="a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82" exitCode=0
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.364664 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerDied","Data":"a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82"}
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.371011 4883 generic.go:334] "Generic (PLEG): container finished" podID="d355ddcd-9120-4436-84c4-928027e6ee33" containerID="ca65d438bf33bf28ae2760bd85500809892b5530a19b9249afc6d3b1e6b1e723" exitCode=0
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.371086 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f74b-account-create-update-lsxls" event={"ID":"d355ddcd-9120-4436-84c4-928027e6ee33","Type":"ContainerDied","Data":"ca65d438bf33bf28ae2760bd85500809892b5530a19b9249afc6d3b1e6b1e723"}
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.371763 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf" (OuterVolumeSpecName: "kube-api-access-22fbf") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "kube-api-access-22fbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377254 4883 generic.go:334] "Generic (PLEG): container finished" podID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerID="4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5" exitCode=137
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377326 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerDied","Data":"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5"}
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377363 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerDied","Data":"987bc0cc38389c3b6190190ae5c16e3637e3022a5ee37c4c3bc24573be51664c"}
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377384 4883 scope.go:117] "RemoveContainer" containerID="4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377510 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.384331 4883 generic.go:334] "Generic (PLEG): container finished" podID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" containerID="d23bd9bd934e3d99aa1bfc6ccd981d2c32f179368b140bc3665f8538d2c19637" exitCode=0
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.384427 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4vxd6" event={"ID":"fdbd0859-6f93-4118-9e5b-2170ec3d43ad","Type":"ContainerDied","Data":"d23bd9bd934e3d99aa1bfc6ccd981d2c32f179368b140bc3665f8538d2c19637"}
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.386086 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b"}
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.393591 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.432559 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data" (OuterVolumeSpecName: "config-data") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.438209 4883 scope.go:117] "RemoveContainer" containerID="0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.454838 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6656f7cc-nv5pp"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.457167 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6656f7cc-nv5pp"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458552 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458566 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458575 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458583 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458592 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22fbf\" (UniqueName: \"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.478204 4883 scope.go:117] "RemoveContainer" containerID="4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5"
Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.478600 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5\": container with ID starting with 4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5 not found: ID does not exist" containerID="4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.478635 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5"} err="failed to get container status \"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5\": rpc error: code = NotFound desc = could not find container \"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5\": container with ID starting with 4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5 not found: ID does not exist"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.478654 4883 scope.go:117] "RemoveContainer" containerID="0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3"
Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.478844 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3\": container with ID starting with 0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3 not found: ID does not exist" containerID="0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.478859 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3"} err="failed to get container status \"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3\": rpc error: code = NotFound desc = could not find container \"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3\": container with ID starting with 0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3 not found: ID does not exist"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.738041 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.750309 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.770704 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.771402 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api-log"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.771417 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api-log"
Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.771444 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.771465 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.771962 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api-log"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.771992 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.773151 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.777843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.777912 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.778132 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.785392 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.853510 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.869757 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caba5f6-d05e-437e-868c-952e8adf3278-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.869816 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870025 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-scripts\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870157 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2caba5f6-d05e-437e-868c-952e8adf3278-logs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870192 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmvk\" (UniqueName: \"kubernetes.io/projected/2caba5f6-d05e-437e-868c-952e8adf3278-kube-api-access-tgmvk\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870217 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data-custom\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870266 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870315 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870333 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.874429 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971565 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971702 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971823 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971854 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") pod \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971872 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971890 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971919 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") pod \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971975 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972041 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972063 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") "
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972561 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-scripts\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972654 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2caba5f6-d05e-437e-868c-952e8adf3278-logs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972684 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmvk\" (UniqueName: \"kubernetes.io/projected/2caba5f6-d05e-437e-868c-952e8adf3278-kube-api-access-tgmvk\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data-custom\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972727 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972755 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972769 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972791 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caba5f6-d05e-437e-868c-952e8adf3278-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972816 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.973035 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.973412 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2caba5f6-d05e-437e-868c-952e8adf3278-logs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.976889 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb" (OuterVolumeSpecName: "kube-api-access-z8mhb") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "kube-api-access-z8mhb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.978090 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" (UID: "e694b4cb-0aa6-46d5-b6be-039d6a92e4a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.983874 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.983935 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caba5f6-d05e-437e-868c-952e8adf3278-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.984284 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs" (OuterVolumeSpecName: "logs") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.984740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw" (OuterVolumeSpecName: "kube-api-access-tqqsw") pod "e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" (UID: "e694b4cb-0aa6-46d5-b6be-039d6a92e4a8"). InnerVolumeSpecName "kube-api-access-tqqsw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.985804 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.991405 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.991736 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.992270 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data-custom\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.993121 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.995954 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-scripts\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.995990 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts" (OuterVolumeSpecName: "scripts") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.996017 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmvk\" (UniqueName: \"kubernetes.io/projected/2caba5f6-d05e-437e-868c-952e8adf3278-kube-api-access-tgmvk\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.033652 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.041324 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data" (OuterVolumeSpecName: "config-data") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.071399 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077200 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077224 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077237 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077250 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077259 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077267 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077276 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077287 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077296 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077303 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.092454 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.163886 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.179637 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.403320 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-l9ldx" event={"ID":"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8","Type":"ContainerDied","Data":"458dceb3e1cca2ec00511601cd3c6de401cd9e82c4e2ccda5ef9b21bd6f813bb"}
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.403378 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="458dceb3e1cca2ec00511601cd3c6de401cd9e82c4e2ccda5ef9b21bd6f813bb"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.403468 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-l9ldx"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.409359 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerDied","Data":"4d280b27ae76beef2731a3863818dd720d9ca5f105e0f710f7b3f7d025052c9f"}
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.409417 4883 scope.go:117] "RemoveContainer" containerID="a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.409672 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.449413 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.469176 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.476447 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 09:23:59 crc kubenswrapper[4883]: E0310 09:23:59.476897 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" containerName="mariadb-database-create"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.476916 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" containerName="mariadb-database-create"
Mar 10 09:23:59 crc kubenswrapper[4883]: E0310 09:23:59.476925 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-httpd"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.476932 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-httpd"
Mar 10 09:23:59 crc kubenswrapper[4883]: E0310 09:23:59.476947 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-log"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.476953 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-log"
Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.477163 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" containerName="mariadb-database-create"
Mar 10 09:23:59 crc 
kubenswrapper[4883]: I0310 09:23:59.477184 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-httpd" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.477194 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-log" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.478159 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.482825 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.482973 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.486329 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.528954 4883 scope.go:117] "RemoveContainer" containerID="9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.592164 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594073 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-logs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " 
pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594194 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlr2j\" (UniqueName: \"kubernetes.io/projected/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-kube-api-access-zlr2j\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594283 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594342 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594645 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594694 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.632857 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:59 crc kubenswrapper[4883]: W0310 09:23:59.656402 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2caba5f6_d05e_437e_868c_952e8adf3278.slice/crio-601c1138d87e47570d65bae4260a979f42c8ffaf3e8bce68acb093f5669efb1b WatchSource:0}: Error finding container 601c1138d87e47570d65bae4260a979f42c8ffaf3e8bce68acb093f5669efb1b: Status 404 returned error can't find the container with id 601c1138d87e47570d65bae4260a979f42c8ffaf3e8bce68acb093f5669efb1b Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.700135 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.700202 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 
09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.700236 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701036 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701101 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-logs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701164 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlr2j\" (UniqueName: \"kubernetes.io/projected/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-kube-api-access-zlr2j\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701217 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701251 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.702623 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.702922 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.705214 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-logs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.705776 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.716253 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.723723 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.727995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.729710 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlr2j\" (UniqueName: \"kubernetes.io/projected/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-kube-api-access-zlr2j\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.750922 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.790766 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.807911 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.909642 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") pod \"d355ddcd-9120-4436-84c4-928027e6ee33\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.909994 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmdwp\" (UniqueName: \"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") pod \"d355ddcd-9120-4436-84c4-928027e6ee33\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.911982 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d355ddcd-9120-4436-84c4-928027e6ee33" (UID: "d355ddcd-9120-4436-84c4-928027e6ee33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.933784 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp" (OuterVolumeSpecName: "kube-api-access-pmdwp") pod "d355ddcd-9120-4436-84c4-928027e6ee33" (UID: "d355ddcd-9120-4436-84c4-928027e6ee33"). InnerVolumeSpecName "kube-api-access-pmdwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.016716 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.016748 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmdwp\" (UniqueName: \"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.067886 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.073520 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.076662 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.091933 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.094761 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" path="/var/lib/kubelet/pods/4853248e-5cd4-4cf3-b9e7-b824fad23efe/volumes" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.095720 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" path="/var/lib/kubelet/pods/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3/volumes" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149372 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"] Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149816 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149836 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149850 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149857 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149870 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9dd286b-6aa5-4525-a645-8e4ec79af348" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149878 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9dd286b-6aa5-4525-a645-8e4ec79af348" 
containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149897 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39def71-60ef-4b2a-823b-1c5e89e02647" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149902 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39def71-60ef-4b2a-823b-1c5e89e02647" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149926 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d355ddcd-9120-4436-84c4-928027e6ee33" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149933 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d355ddcd-9120-4436-84c4-928027e6ee33" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150178 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d355ddcd-9120-4436-84c4-928027e6ee33" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150221 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150232 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9dd286b-6aa5-4525-a645-8e4ec79af348" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150246 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e39def71-60ef-4b2a-823b-1c5e89e02647" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150265 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: 
I0310 09:24:00.151136 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.154858 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.155016 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.155224 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.156060 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.213421 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.221114 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") pod \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.228000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") pod \"e9dd286b-6aa5-4525-a645-8e4ec79af348\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.233532 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd5d5\" (UniqueName: 
\"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") pod \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.233617 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") pod \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.234394 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") pod \"e39def71-60ef-4b2a-823b-1c5e89e02647\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.234520 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") pod \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.234571 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") pod \"e39def71-60ef-4b2a-823b-1c5e89e02647\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.234800 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") pod \"e9dd286b-6aa5-4525-a645-8e4ec79af348\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " Mar 10 09:24:00 
crc kubenswrapper[4883]: I0310 09:24:00.235425 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") pod \"auto-csr-approver-29552244-zwxrg\" (UID: \"391543cc-519b-4e01-8886-04bde62c5298\") " pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.236319 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e39def71-60ef-4b2a-823b-1c5e89e02647" (UID: "e39def71-60ef-4b2a-823b-1c5e89e02647"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.236884 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fdbd0859-6f93-4118-9e5b-2170ec3d43ad" (UID: "fdbd0859-6f93-4118-9e5b-2170ec3d43ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.237552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" (UID: "f9eca221-08d6-4b22-8d5c-1cd9e95c65d9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.238460 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9dd286b-6aa5-4525-a645-8e4ec79af348" (UID: "e9dd286b-6aa5-4525-a645-8e4ec79af348"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.239581 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g" (OuterVolumeSpecName: "kube-api-access-bll5g") pod "fdbd0859-6f93-4118-9e5b-2170ec3d43ad" (UID: "fdbd0859-6f93-4118-9e5b-2170ec3d43ad"). InnerVolumeSpecName "kube-api-access-bll5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.253707 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5" (OuterVolumeSpecName: "kube-api-access-vd5d5") pod "f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" (UID: "f9eca221-08d6-4b22-8d5c-1cd9e95c65d9"). InnerVolumeSpecName "kube-api-access-vd5d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.257917 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6" (OuterVolumeSpecName: "kube-api-access-gzwb6") pod "e39def71-60ef-4b2a-823b-1c5e89e02647" (UID: "e39def71-60ef-4b2a-823b-1c5e89e02647"). InnerVolumeSpecName "kube-api-access-gzwb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.271138 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p" (OuterVolumeSpecName: "kube-api-access-bxh9p") pod "e9dd286b-6aa5-4525-a645-8e4ec79af348" (UID: "e9dd286b-6aa5-4525-a645-8e4ec79af348"). InnerVolumeSpecName "kube-api-access-bxh9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337411 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337550 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337595 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337638 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337816 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337874 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337954 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs" (OuterVolumeSpecName: "logs") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.338057 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.338181 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff6w4\" (UniqueName: \"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.338199 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: 
"f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342105 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") pod \"auto-csr-approver-29552244-zwxrg\" (UID: \"391543cc-519b-4e01-8886-04bde62c5298\") " pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342373 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342396 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342410 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342422 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342436 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342447 4883 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-vd5d5\" (UniqueName: \"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342458 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342467 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342494 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342504 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.362218 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.369586 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") pod \"auto-csr-approver-29552244-zwxrg\" (UID: \"391543cc-519b-4e01-8886-04bde62c5298\") " pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.388433 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts" (OuterVolumeSpecName: "scripts") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.401269 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4" (OuterVolumeSpecName: "kube-api-access-ff6w4") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "kube-api-access-ff6w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.420981 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.449011 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" event={"ID":"e9dd286b-6aa5-4525-a645-8e4ec79af348","Type":"ContainerDied","Data":"328d6abd25cfa7a71e9b0831cc5309de56da4eea4f989a50a43c9c33525a5879"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.449084 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328d6abd25cfa7a71e9b0831cc5309de56da4eea4f989a50a43c9c33525a5879" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.449231 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453051 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff6w4\" (UniqueName: \"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453522 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453585 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453658 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453815 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453819 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f74b-account-create-update-lsxls" event={"ID":"d355ddcd-9120-4436-84c4-928027e6ee33","Type":"ContainerDied","Data":"6d8e8a7ffeaf42a416ff3d5dced177bcd0777c66e72676ad42d73d1bfa28b123"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453864 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d8e8a7ffeaf42a416ff3d5dced177bcd0777c66e72676ad42d73d1bfa28b123" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.461733 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.461905 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-central-agent" containerID="cri-o://be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" gracePeriod=30 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.462009 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.462045 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="proxy-httpd" containerID="cri-o://f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" gracePeriod=30 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.462118 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" 
containerName="ceilometer-notification-agent" containerID="cri-o://21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" gracePeriod=30 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.462231 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="sg-core" containerID="cri-o://e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" gracePeriod=30 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482565 4883 generic.go:334] "Generic (PLEG): container finished" podID="f9967357-b98f-4e31-9934-f99669b31024" containerID="93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" exitCode=0 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482655 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerDied","Data":"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482680 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerDied","Data":"71b700d498775842c5c3b3c9b8a10f0292a828bd6a69eebba134bc667b2b5df6"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482725 4883 scope.go:117] "RemoveContainer" containerID="93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482904 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.491044 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.491548 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.495199 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.498435 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.888122954 podStartE2EDuration="11.498422377s" podCreationTimestamp="2026-03-10 09:23:49 +0000 UTC" firstStartedPulling="2026-03-10 09:23:54.97045563 +0000 UTC m=+1221.225353520" lastFinishedPulling="2026-03-10 09:23:59.580755054 +0000 UTC m=+1225.835652943" observedRunningTime="2026-03-10 09:24:00.483805014 +0000 UTC m=+1226.738702903" watchObservedRunningTime="2026-03-10 09:24:00.498422377 +0000 UTC m=+1226.753320266" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.510706 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zr486" event={"ID":"e39def71-60ef-4b2a-823b-1c5e89e02647","Type":"ContainerDied","Data":"5149fd889fe5536a323a71b17dfaf7c9c619e260b3b58f02d0982a4f00153649"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.510757 4883 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="5149fd889fe5536a323a71b17dfaf7c9c619e260b3b58f02d0982a4f00153649" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.510841 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.513327 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data" (OuterVolumeSpecName: "config-data") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.513530 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2caba5f6-d05e-437e-868c-952e8adf3278","Type":"ContainerStarted","Data":"601c1138d87e47570d65bae4260a979f42c8ffaf3e8bce68acb093f5669efb1b"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.520349 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4vxd6" event={"ID":"fdbd0859-6f93-4118-9e5b-2170ec3d43ad","Type":"ContainerDied","Data":"8ad2396f2e7a9d3bbf9da675e807414641fe972e08a0b92dada5b43eb2016b98"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.520388 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ad2396f2e7a9d3bbf9da675e807414641fe972e08a0b92dada5b43eb2016b98" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.520444 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.527811 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" event={"ID":"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9","Type":"ContainerDied","Data":"e69d19127eb3d9ce4f3e34fa37eba123f18c99b6eaeaed465228d938b7758035"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.527841 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e69d19127eb3d9ce4f3e34fa37eba123f18c99b6eaeaed465228d938b7758035" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.528124 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.528946 4883 scope.go:117] "RemoveContainer" containerID="feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.529754 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.566965 4883 scope.go:117] "RemoveContainer" containerID="93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.567383 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032\": container with ID starting with 93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032 not found: ID does not exist" containerID="93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.567418 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032"} err="failed to get container status \"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032\": rpc error: code = NotFound desc = could not find container \"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032\": container with ID starting with 93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032 not found: ID does not exist" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.567460 4883 scope.go:117] "RemoveContainer" containerID="feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.567966 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731\": container with ID starting with feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731 not found: ID does not exist" containerID="feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.568072 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731"} err="failed to get container status \"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731\": rpc error: code = NotFound desc = could not find container \"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731\": container with ID starting with feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731 not found: ID does not exist" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.571772 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 
09:24:00.571795 4883 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.571806 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.853898 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.864522 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.878073 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.878623 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-httpd" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.878644 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-httpd" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.878674 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-log" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.878681 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-log" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.878867 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-httpd" Mar 10 09:24:00 crc kubenswrapper[4883]: 
I0310 09:24:00.878901 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-log" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.879959 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.881973 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.882382 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.882541 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.999762 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhwv\" (UniqueName: \"kubernetes.io/projected/676458e7-e4a0-4f1a-b200-0ab75faaddb4-kube-api-access-gfhwv\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.999839 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.999870 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000083 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000248 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000404 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000437 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000507 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.027712 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"] Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.103305 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.105724 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.105995 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106086 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106180 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106203 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106246 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106716 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhwv\" (UniqueName: \"kubernetes.io/projected/676458e7-e4a0-4f1a-b200-0ab75faaddb4-kube-api-access-gfhwv\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.103992 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.108122 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.110152 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.125681 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.144699 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.145466 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.151056 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhwv\" (UniqueName: \"kubernetes.io/projected/676458e7-e4a0-4f1a-b200-0ab75faaddb4-kube-api-access-gfhwv\") pod \"glance-default-internal-api-0\" (UID: 
\"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.152614 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.169565 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.210422 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.273977 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415009 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415494 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415674 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415760 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415804 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.421393 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx" (OuterVolumeSpecName: "kube-api-access-xc9sx") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "kube-api-access-xc9sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.430375 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.468863 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config" (OuterVolumeSpecName: "config") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.483841 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.508665 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518715 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518742 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518752 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518761 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518771 4883 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559256 4883 generic.go:334] "Generic (PLEG): container finished" podID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerID="f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" exitCode=0 Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559300 4883 generic.go:334] "Generic (PLEG): container finished" podID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerID="e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" exitCode=2 Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559309 4883 generic.go:334] "Generic (PLEG): 
container finished" podID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerID="21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" exitCode=0 Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559387 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559400 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.573945 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0","Type":"ContainerStarted","Data":"2d927a827f1b57a3c74a0466ad13ecf698a963090190150471c91fa25975bf4f"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.574008 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0","Type":"ContainerStarted","Data":"a2dbead07466ce40aa853155f09d4924420bedf2219e3d697667f7310f78ef12"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.577940 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" event={"ID":"391543cc-519b-4e01-8886-04bde62c5298","Type":"ContainerStarted","Data":"f33d45e49c487b98183faf1adc73353bffd3605e2e4034356d70d18beabca3f0"} 
Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583861 4883 generic.go:334] "Generic (PLEG): container finished" podID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerID="02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" exitCode=0 Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583925 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerDied","Data":"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583955 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerDied","Data":"98d045da7fa92d2ea6ec832a583b37763ca71714b8c37b66a7d614c5c8099df1"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583977 4883 scope.go:117] "RemoveContainer" containerID="5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583994 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.607992 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2caba5f6-d05e-437e-868c-952e8adf3278","Type":"ContainerStarted","Data":"e7708b6c2d732bd1f8d0d9576d52f4af9f622df129248fbf8312d95b061e492f"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.608021 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2caba5f6-d05e-437e-868c-952e8adf3278","Type":"ContainerStarted","Data":"eda0dc97c6e82a19cf11402d6161e9b3e6ba6c1878c7e9895582b4a78c155d76"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.608129 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.661218 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.6611959450000002 podStartE2EDuration="3.661195945s" podCreationTimestamp="2026-03-10 09:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:01.64080287 +0000 UTC m=+1227.895700759" watchObservedRunningTime="2026-03-10 09:24:01.661195945 +0000 UTC m=+1227.916093834" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.679204 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.681096 4883 scope.go:117] "RemoveContainer" containerID="02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.687705 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.708646 4883 scope.go:117] "RemoveContainer" 
containerID="5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" Mar 10 09:24:01 crc kubenswrapper[4883]: E0310 09:24:01.712669 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4\": container with ID starting with 5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4 not found: ID does not exist" containerID="5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.712727 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4"} err="failed to get container status \"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4\": rpc error: code = NotFound desc = could not find container \"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4\": container with ID starting with 5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4 not found: ID does not exist" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.712761 4883 scope.go:117] "RemoveContainer" containerID="02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" Mar 10 09:24:01 crc kubenswrapper[4883]: E0310 09:24:01.716561 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d\": container with ID starting with 02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d not found: ID does not exist" containerID="02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.716598 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d"} err="failed to get container status \"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d\": rpc error: code = NotFound desc = could not find container \"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d\": container with ID starting with 02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d not found: ID does not exist" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.749891 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:01 crc kubenswrapper[4883]: W0310 09:24:01.758601 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676458e7_e4a0_4f1a_b200_0ab75faaddb4.slice/crio-28dccf89d163e87b333b06a1d113b6744cf5ce84a4006385fb3291cad97fab95 WatchSource:0}: Error finding container 28dccf89d163e87b333b06a1d113b6744cf5ce84a4006385fb3291cad97fab95: Status 404 returned error can't find the container with id 28dccf89d163e87b333b06a1d113b6744cf5ce84a4006385fb3291cad97fab95 Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.098970 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" path="/var/lib/kubelet/pods/de4a8a41-06f6-4d5a-939c-22eebc30b0d8/volumes" Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.100043 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9967357-b98f-4e31-9934-f99669b31024" path="/var/lib/kubelet/pods/f9967357-b98f-4e31-9934-f99669b31024/volumes" Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.622964 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"676458e7-e4a0-4f1a-b200-0ab75faaddb4","Type":"ContainerStarted","Data":"beb67b616015fe9791d1ac986df0b2ba15e2cdca44d7b610c14f4bb40905ea5e"} Mar 10 
09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.623251 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"676458e7-e4a0-4f1a-b200-0ab75faaddb4","Type":"ContainerStarted","Data":"28dccf89d163e87b333b06a1d113b6744cf5ce84a4006385fb3291cad97fab95"} Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.627147 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0","Type":"ContainerStarted","Data":"18e79717e974d3ac224da6f6ea6c6f16e46988561d4afeb1e5d571670e122dbd"} Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.635614 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" event={"ID":"391543cc-519b-4e01-8886-04bde62c5298","Type":"ContainerStarted","Data":"6ae90501c695dbc91d1432637dfb47eec6c627c8a17eb905f14422cd13154c3c"} Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.645222 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.645202203 podStartE2EDuration="3.645202203s" podCreationTimestamp="2026-03-10 09:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:02.644739581 +0000 UTC m=+1228.899637470" watchObservedRunningTime="2026-03-10 09:24:02.645202203 +0000 UTC m=+1228.900100092" Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.658836 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" podStartSLOduration=1.663565262 podStartE2EDuration="2.65881887s" podCreationTimestamp="2026-03-10 09:24:00 +0000 UTC" firstStartedPulling="2026-03-10 09:24:01.042406682 +0000 UTC m=+1227.297304571" lastFinishedPulling="2026-03-10 09:24:02.037660289 +0000 UTC m=+1228.292558179" 
observedRunningTime="2026-03-10 09:24:02.656821243 +0000 UTC m=+1228.911719132" watchObservedRunningTime="2026-03-10 09:24:02.65881887 +0000 UTC m=+1228.913716759" Mar 10 09:24:03 crc kubenswrapper[4883]: I0310 09:24:03.647172 4883 generic.go:334] "Generic (PLEG): container finished" podID="391543cc-519b-4e01-8886-04bde62c5298" containerID="6ae90501c695dbc91d1432637dfb47eec6c627c8a17eb905f14422cd13154c3c" exitCode=0 Mar 10 09:24:03 crc kubenswrapper[4883]: I0310 09:24:03.647244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" event={"ID":"391543cc-519b-4e01-8886-04bde62c5298","Type":"ContainerDied","Data":"6ae90501c695dbc91d1432637dfb47eec6c627c8a17eb905f14422cd13154c3c"} Mar 10 09:24:03 crc kubenswrapper[4883]: I0310 09:24:03.650170 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"676458e7-e4a0-4f1a-b200-0ab75faaddb4","Type":"ContainerStarted","Data":"02a407f3ae262acf186ff2a707a604e3fc6f0360578ee27f45091e9469bb630c"} Mar 10 09:24:03 crc kubenswrapper[4883]: I0310 09:24:03.679888 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.679868761 podStartE2EDuration="3.679868761s" podCreationTimestamp="2026-03-10 09:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:03.676853495 +0000 UTC m=+1229.931751385" watchObservedRunningTime="2026-03-10 09:24:03.679868761 +0000 UTC m=+1229.934766649" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.215826 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.405996 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406062 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406092 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406117 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406140 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406157 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406178 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406869 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.414022 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.417907 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8" (OuterVolumeSpecName: "kube-api-access-xs4v8") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "kube-api-access-xs4v8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.434521 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts" (OuterVolumeSpecName: "scripts") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.441210 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.463188 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.486670 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data" (OuterVolumeSpecName: "config-data") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510157 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510193 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510231 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510245 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510255 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510266 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510276 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664542 4883 generic.go:334] "Generic 
(PLEG): container finished" podID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerID="be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" exitCode=0 Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664671 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664750 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7"} Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664796 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"8662768bfd83f0fff77bedd9babc38ac435600a470cdeb93306da3bcece7d468"} Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664837 4883 scope.go:117] "RemoveContainer" containerID="f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.696649 4883 scope.go:117] "RemoveContainer" containerID="e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.715719 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.730952 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737166 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737619 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="proxy-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737638 4883 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="proxy-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737651 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-notification-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737657 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-notification-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737667 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-central-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737673 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-central-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737690 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-api" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737696 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-api" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737704 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737709 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737718 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="sg-core" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 
09:24:04.737725 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="sg-core" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737895 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737913 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-api" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737922 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="proxy-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737931 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-notification-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737941 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="sg-core" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737952 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-central-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.738525 4883 scope.go:117] "RemoveContainer" containerID="21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.746922 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.747059 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.749538 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.749792 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.776582 4883 scope.go:117] "RemoveContainer" containerID="be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.811370 4883 scope.go:117] "RemoveContainer" containerID="f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.812327 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db\": container with ID starting with f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db not found: ID does not exist" containerID="f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.812375 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db"} err="failed to get container status \"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db\": rpc error: code = NotFound desc = could not find container \"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db\": container with ID starting with f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db not found: ID does not exist" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.812406 4883 scope.go:117] "RemoveContainer" containerID="e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" Mar 10 09:24:04 crc 
kubenswrapper[4883]: E0310 09:24:04.813003 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b\": container with ID starting with e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b not found: ID does not exist" containerID="e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813046 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b"} err="failed to get container status \"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b\": rpc error: code = NotFound desc = could not find container \"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b\": container with ID starting with e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b not found: ID does not exist" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813063 4883 scope.go:117] "RemoveContainer" containerID="21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.813420 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a\": container with ID starting with 21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a not found: ID does not exist" containerID="21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813447 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a"} err="failed to get container status 
\"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a\": rpc error: code = NotFound desc = could not find container \"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a\": container with ID starting with 21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a not found: ID does not exist" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813463 4883 scope.go:117] "RemoveContainer" containerID="be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.813908 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7\": container with ID starting with be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7 not found: ID does not exist" containerID="be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813950 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7"} err="failed to get container status \"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7\": rpc error: code = NotFound desc = could not find container \"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7\": container with ID starting with be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7 not found: ID does not exist" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921130 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921221 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921282 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921305 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921341 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxwp2\" (UniqueName: \"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921365 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921902 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.968942 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024533 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024678 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024717 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024740 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024776 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxwp2\" (UniqueName: 
\"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024812 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024888 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.025497 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.025518 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.031652 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.032525 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.033061 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.034994 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.045616 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxwp2\" (UniqueName: \"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.065989 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.126431 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") pod \"391543cc-519b-4e01-8886-04bde62c5298\" (UID: \"391543cc-519b-4e01-8886-04bde62c5298\") " Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.130507 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx" (OuterVolumeSpecName: "kube-api-access-hvpgx") pod "391543cc-519b-4e01-8886-04bde62c5298" (UID: "391543cc-519b-4e01-8886-04bde62c5298"). InnerVolumeSpecName "kube-api-access-hvpgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.231744 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.511291 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:05 crc kubenswrapper[4883]: W0310 09:24:05.514727 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f8485ae_b380_4555_8f4a_a71544094774.slice/crio-b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe WatchSource:0}: Error finding container b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe: Status 404 returned error can't find the container with id b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.679536 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552244-zwxrg" event={"ID":"391543cc-519b-4e01-8886-04bde62c5298","Type":"ContainerDied","Data":"f33d45e49c487b98183faf1adc73353bffd3605e2e4034356d70d18beabca3f0"} Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.679615 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f33d45e49c487b98183faf1adc73353bffd3605e2e4034356d70d18beabca3f0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.679553 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.691277 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe"} Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.718987 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"] Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.728219 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"] Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.090723 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" path="/var/lib/kubelet/pods/2aae9177-76f0-4502-8f6a-19ad69a255ae/volumes" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.092023 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38171111-f624-438d-ba5a-36f6b9cb29bf" path="/var/lib/kubelet/pods/38171111-f624-438d-ba5a-36f6b9cb29bf/volumes" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.251497 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:24:06 crc kubenswrapper[4883]: E0310 09:24:06.252189 
4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391543cc-519b-4e01-8886-04bde62c5298" containerName="oc" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.252209 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="391543cc-519b-4e01-8886-04bde62c5298" containerName="oc" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.252402 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="391543cc-519b-4e01-8886-04bde62c5298" containerName="oc" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.254322 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.256874 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.257344 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.270229 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hmfjf" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.280035 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.354906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.355119 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nmp8\" (UniqueName: 
\"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.355149 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.355178 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.456674 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nmp8\" (UniqueName: \"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.456720 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.456753 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.456830 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.462460 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.463281 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.470140 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.484875 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4nmp8\" (UniqueName: \"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.588782 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.703707 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} Mar 10 09:24:07 crc kubenswrapper[4883]: I0310 09:24:07.001686 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:24:07 crc kubenswrapper[4883]: W0310 09:24:07.013661 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46c8e962_9007_49e1_bd9f_d822e9100291.slice/crio-476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771 WatchSource:0}: Error finding container 476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771: Status 404 returned error can't find the container with id 476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771 Mar 10 09:24:07 crc kubenswrapper[4883]: I0310 09:24:07.714831 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pghj7" event={"ID":"46c8e962-9007-49e1-bd9f-d822e9100291","Type":"ContainerStarted","Data":"476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771"} Mar 10 09:24:07 crc kubenswrapper[4883]: I0310 09:24:07.716797 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} Mar 10 09:24:07 crc kubenswrapper[4883]: I0310 09:24:07.716870 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} Mar 10 09:24:09 crc kubenswrapper[4883]: I0310 09:24:09.809246 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 09:24:09 crc kubenswrapper[4883]: I0310 09:24:09.809551 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 09:24:09 crc kubenswrapper[4883]: I0310 09:24:09.851775 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 09:24:09 crc kubenswrapper[4883]: I0310 09:24:09.854885 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.760559 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.761014 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.761036 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.761049 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.792680 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.686281202 podStartE2EDuration="6.792650822s" podCreationTimestamp="2026-03-10 09:24:04 +0000 UTC" firstStartedPulling="2026-03-10 09:24:05.517829641 +0000 UTC m=+1231.772727520" lastFinishedPulling="2026-03-10 09:24:09.624199252 +0000 UTC m=+1235.879097140" observedRunningTime="2026-03-10 09:24:10.779432216 +0000 UTC m=+1237.034330105" watchObservedRunningTime="2026-03-10 09:24:10.792650822 +0000 UTC m=+1237.047548701" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.912882 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.211810 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.212080 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.245625 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.247041 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.784246 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.784338 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:12 crc kubenswrapper[4883]: I0310 09:24:12.511047 4883 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 09:24:12 crc kubenswrapper[4883]: I0310 09:24:12.625334 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 09:24:13 crc kubenswrapper[4883]: I0310 09:24:13.793706 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:13 crc kubenswrapper[4883]: I0310 09:24:13.827376 4883 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 09:24:13 crc kubenswrapper[4883]: I0310 09:24:13.957885 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:14 crc kubenswrapper[4883]: I0310 09:24:14.917726 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:14 crc kubenswrapper[4883]: I0310 09:24:14.918027 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-central-agent" containerID="cri-o://0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" gracePeriod=30 Mar 10 09:24:14 crc kubenswrapper[4883]: I0310 09:24:14.918583 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="proxy-httpd" containerID="cri-o://fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" gracePeriod=30 Mar 10 09:24:14 crc kubenswrapper[4883]: I0310 09:24:14.918662 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="sg-core" containerID="cri-o://d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" gracePeriod=30 Mar 10 09:24:14 crc 
kubenswrapper[4883]: I0310 09:24:14.918710 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-notification-agent" containerID="cri-o://766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" gracePeriod=30 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.556748 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.666705 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.667118 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.667268 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668004 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668175 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668238 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxwp2\" (UniqueName: \"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668354 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668963 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.669030 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.670725 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.670755 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.676337 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts" (OuterVolumeSpecName: "scripts") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.679186 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2" (OuterVolumeSpecName: "kube-api-access-sxwp2") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "kube-api-access-sxwp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.693155 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.744254 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.750535 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data" (OuterVolumeSpecName: "config-data") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773033 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773067 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773077 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773090 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxwp2\" (UniqueName: \"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") on node \"crc\" DevicePath \"\"" Mar 10 
09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773098 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.847093 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pghj7" event={"ID":"46c8e962-9007-49e1-bd9f-d822e9100291","Type":"ContainerStarted","Data":"996ee60e7ca18498ccd88f148bbd61209a4f5cc8500909482eda8135e24f983e"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.849983 4883 generic.go:334] "Generic (PLEG): container finished" podID="0f8485ae-b380-4555-8f4a-a71544094774" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" exitCode=0 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850010 4883 generic.go:334] "Generic (PLEG): container finished" podID="0f8485ae-b380-4555-8f4a-a71544094774" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" exitCode=2 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850018 4883 generic.go:334] "Generic (PLEG): container finished" podID="0f8485ae-b380-4555-8f4a-a71544094774" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" exitCode=0 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850026 4883 generic.go:334] "Generic (PLEG): container finished" podID="0f8485ae-b380-4555-8f4a-a71544094774" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" exitCode=0 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850028 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850057 4883 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850103 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850117 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850126 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850136 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850152 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.869914 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.874689 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-pghj7" podStartSLOduration=1.588737966 podStartE2EDuration="9.874674135s" podCreationTimestamp="2026-03-10 09:24:06 +0000 UTC" firstStartedPulling="2026-03-10 09:24:07.015086464 +0000 UTC 
m=+1233.269984354" lastFinishedPulling="2026-03-10 09:24:15.301022634 +0000 UTC m=+1241.555920523" observedRunningTime="2026-03-10 09:24:15.867834437 +0000 UTC m=+1242.122732327" watchObservedRunningTime="2026-03-10 09:24:15.874674135 +0000 UTC m=+1242.129572024" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.887951 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.897731 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.908111 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.912185 4883 scope.go:117] "RemoveContainer" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.917695 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.918145 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="sg-core" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918163 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="sg-core" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.918172 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-central-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918178 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-central-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.918195 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-notification-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918201 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-notification-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.918232 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="proxy-httpd" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918237 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="proxy-httpd" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918402 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="proxy-httpd" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918417 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="sg-core" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918425 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-central-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918440 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-notification-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.925453 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.932207 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.932372 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.949577 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.954031 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.954081 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} err="failed to get container status \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.954115 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.958382 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.958646 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} err="failed to get container status \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": rpc error: code = NotFound desc = could not find container \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.958716 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.958394 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.959133 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.959190 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} err="failed to get container status \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": 
rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.959222 4883 scope.go:117] "RemoveContainer" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.959782 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.959814 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} err="failed to get container status \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.959831 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.960357 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} err="failed to get container status 
\"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.960404 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.960681 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} err="failed to get container status \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": rpc error: code = NotFound desc = could not find container \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.960715 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.964418 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} err="failed to get container status \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.964448 4883 scope.go:117] "RemoveContainer" 
containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.964998 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} err="failed to get container status \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965048 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965410 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} err="failed to get container status \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965442 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965726 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} err="failed to get container status \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": rpc error: code = NotFound desc = could 
not find container \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965745 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966012 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} err="failed to get container status \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966031 4883 scope.go:117] "RemoveContainer" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966240 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} err="failed to get container status \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966258 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 
09:24:15.966486 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} err="failed to get container status \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966507 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966688 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} err="failed to get container status \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": rpc error: code = NotFound desc = could not find container \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966707 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966876 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} err="failed to get container status \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 
766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966895 4883 scope.go:117] "RemoveContainer" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.967057 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} err="failed to get container status \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079281 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079329 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079369 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc 
kubenswrapper[4883]: I0310 09:24:16.079456 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079509 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079634 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079734 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hdp\" (UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.091515 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f8485ae-b380-4555-8f4a-a71544094774" path="/var/lib/kubelet/pods/0f8485ae-b380-4555-8f4a-a71544094774/volumes" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182380 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") pod \"ceilometer-0\" (UID: 
\"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182427 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182500 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182635 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182672 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182717 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182761 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hdp\" (UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.183789 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.184269 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.189372 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.189828 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.190376 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.190590 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.202049 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hdp\" (UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.241789 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.524836 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.670990 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.678407 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.862309 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"bee0f47d7b61a9dac31a1270d716c8980786df32f0465319473f68681f0f03f9"}
Mar 10 09:24:17 crc kubenswrapper[4883]: I0310 09:24:17.888423 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6"}
Mar 10 09:24:18 crc kubenswrapper[4883]: I0310 09:24:18.898435 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0"}
Mar 10 09:24:18 crc kubenswrapper[4883]: I0310 09:24:18.898909 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9"}
Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.942549 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a"}
Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.943155 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.942916 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="proxy-httpd" containerID="cri-o://8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a" gracePeriod=30
Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.942684 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-central-agent" containerID="cri-o://cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6" gracePeriod=30
Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.942928 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="sg-core" containerID="cri-o://c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0" gracePeriod=30
Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.942940 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-notification-agent" containerID="cri-o://c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9" gracePeriod=30
Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.945596 4883 generic.go:334] "Generic (PLEG): container finished" podID="46c8e962-9007-49e1-bd9f-d822e9100291" containerID="996ee60e7ca18498ccd88f148bbd61209a4f5cc8500909482eda8135e24f983e" exitCode=0
Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.945652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pghj7" event={"ID":"46c8e962-9007-49e1-bd9f-d822e9100291","Type":"ContainerDied","Data":"996ee60e7ca18498ccd88f148bbd61209a4f5cc8500909482eda8135e24f983e"}
Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.964369 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.437312297 podStartE2EDuration="6.964342091s" podCreationTimestamp="2026-03-10 09:24:15 +0000 UTC" firstStartedPulling="2026-03-10 09:24:16.677976729 +0000 UTC m=+1242.932874618" lastFinishedPulling="2026-03-10 09:24:21.205006523 +0000 UTC m=+1247.459904412" observedRunningTime="2026-03-10 09:24:21.957582765 +0000 UTC m=+1248.212480654" watchObservedRunningTime="2026-03-10 09:24:21.964342091 +0000 UTC m=+1248.219239980"
Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.957104 4883 generic.go:334] "Generic (PLEG): container finished" podID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerID="8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a" exitCode=0
Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.957873 4883 generic.go:334] "Generic (PLEG): container finished" podID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerID="c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0" exitCode=2
Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.957962 4883 generic.go:334] "Generic (PLEG): container finished" podID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerID="c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9" exitCode=0
Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.957179 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a"}
Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.958074 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0"}
Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.958095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9"}
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.240670 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pghj7"
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.436078 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") pod \"46c8e962-9007-49e1-bd9f-d822e9100291\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") "
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.436229 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nmp8\" (UniqueName: \"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") pod \"46c8e962-9007-49e1-bd9f-d822e9100291\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") "
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.436384 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") pod \"46c8e962-9007-49e1-bd9f-d822e9100291\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") "
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.436450 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") pod \"46c8e962-9007-49e1-bd9f-d822e9100291\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") "
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.442599 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts" (OuterVolumeSpecName: "scripts") pod "46c8e962-9007-49e1-bd9f-d822e9100291" (UID: "46c8e962-9007-49e1-bd9f-d822e9100291"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.443803 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8" (OuterVolumeSpecName: "kube-api-access-4nmp8") pod "46c8e962-9007-49e1-bd9f-d822e9100291" (UID: "46c8e962-9007-49e1-bd9f-d822e9100291"). InnerVolumeSpecName "kube-api-access-4nmp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.479557 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data" (OuterVolumeSpecName: "config-data") pod "46c8e962-9007-49e1-bd9f-d822e9100291" (UID: "46c8e962-9007-49e1-bd9f-d822e9100291"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.484664 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46c8e962-9007-49e1-bd9f-d822e9100291" (UID: "46c8e962-9007-49e1-bd9f-d822e9100291"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.540151 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.540257 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nmp8\" (UniqueName: \"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.540358 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.540373 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.969514 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pghj7" event={"ID":"46c8e962-9007-49e1-bd9f-d822e9100291","Type":"ContainerDied","Data":"476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771"}
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.970553 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771"
Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.969807 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pghj7"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.104834 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 10 09:24:24 crc kubenswrapper[4883]: E0310 09:24:24.105173 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c8e962-9007-49e1-bd9f-d822e9100291" containerName="nova-cell0-conductor-db-sync"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.105190 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c8e962-9007-49e1-bd9f-d822e9100291" containerName="nova-cell0-conductor-db-sync"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.105372 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c8e962-9007-49e1-bd9f-d822e9100291" containerName="nova-cell0-conductor-db-sync"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.106160 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.107804 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.109314 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hmfjf"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.121927 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.254251 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9wfk\" (UniqueName: \"kubernetes.io/projected/19096ebe-3796-4e22-a477-45d3e635a80a-kube-api-access-c9wfk\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.254352 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.254423 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.356729 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.356809 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.356922 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9wfk\" (UniqueName: \"kubernetes.io/projected/19096ebe-3796-4e22-a477-45d3e635a80a-kube-api-access-c9wfk\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.362034 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.362135 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.372182 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9wfk\" (UniqueName: \"kubernetes.io/projected/19096ebe-3796-4e22-a477-45d3e635a80a-kube-api-access-c9wfk\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.404785 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.419959 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.559976 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4hdp\" (UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") "
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560090 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") "
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560251 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") "
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560302 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") "
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560558 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") "
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560604 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") "
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560677 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") "
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.561159 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.561627 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.562333 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.562366 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.564669 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts" (OuterVolumeSpecName: "scripts") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.577874 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp" (OuterVolumeSpecName: "kube-api-access-c4hdp") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "kube-api-access-c4hdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.589698 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.629716 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.634910 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data" (OuterVolumeSpecName: "config-data") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665030 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665055 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4hdp\" (UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665068 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665080 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665089 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:24 crc kubenswrapper[4883]: W0310 09:24:24.838023 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19096ebe_3796_4e22_a477_45d3e635a80a.slice/crio-75524614d6b68118e286c110d9aa73f78a9be830790220f146be33f8f77b6d0d WatchSource:0}: Error finding container 75524614d6b68118e286c110d9aa73f78a9be830790220f146be33f8f77b6d0d: Status 404 returned error can't find the container with id 75524614d6b68118e286c110d9aa73f78a9be830790220f146be33f8f77b6d0d
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.839035 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983760 4883 generic.go:334] "Generic (PLEG): container finished" podID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerID="cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6" exitCode=0
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983864 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983885 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6"}
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983960 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"bee0f47d7b61a9dac31a1270d716c8980786df32f0465319473f68681f0f03f9"}
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983988 4883 scope.go:117] "RemoveContainer" containerID="8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a"
Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.986329 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"19096ebe-3796-4e22-a477-45d3e635a80a","Type":"ContainerStarted","Data":"75524614d6b68118e286c110d9aa73f78a9be830790220f146be33f8f77b6d0d"}
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.008088 4883 scope.go:117] "RemoveContainer" containerID="c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.019012 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.026402 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.033499 4883 scope.go:117] "RemoveContainer" containerID="c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.035792 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.036136 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="proxy-httpd"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036149 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="proxy-httpd"
Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.036165 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="sg-core"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036171 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="sg-core"
Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.036182 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-notification-agent"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036188 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-notification-agent"
Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.036208 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-central-agent"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036213 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-central-agent"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036467 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="sg-core"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036498 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-notification-agent"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036505 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="proxy-httpd"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036518 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-central-agent"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.038392 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.039762 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.040821 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.049308 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.074630 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.074763 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.074842 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpz25\" (UniqueName: \"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.075099 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.075131 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.075187 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.075221 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.077941 4883 scope.go:117] "RemoveContainer" containerID="cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6"
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096076 4883 scope.go:117] "RemoveContainer"
containerID="8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.096382 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a\": container with ID starting with 8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a not found: ID does not exist" containerID="8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096412 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a"} err="failed to get container status \"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a\": rpc error: code = NotFound desc = could not find container \"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a\": container with ID starting with 8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a not found: ID does not exist" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096433 4883 scope.go:117] "RemoveContainer" containerID="c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.096748 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0\": container with ID starting with c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0 not found: ID does not exist" containerID="c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096773 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0"} err="failed to get container status \"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0\": rpc error: code = NotFound desc = could not find container \"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0\": container with ID starting with c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0 not found: ID does not exist" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096788 4883 scope.go:117] "RemoveContainer" containerID="c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.097115 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9\": container with ID starting with c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9 not found: ID does not exist" containerID="c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.097136 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9"} err="failed to get container status \"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9\": rpc error: code = NotFound desc = could not find container \"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9\": container with ID starting with c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9 not found: ID does not exist" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.097150 4883 scope.go:117] "RemoveContainer" containerID="cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.097335 4883 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6\": container with ID starting with cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6 not found: ID does not exist" containerID="cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.097354 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6"} err="failed to get container status \"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6\": rpc error: code = NotFound desc = could not find container \"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6\": container with ID starting with cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6 not found: ID does not exist" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.177895 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.177958 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.177990 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpz25\" (UniqueName: \"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" 
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178073 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178097 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178120 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178147 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178399 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.184335 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.184589 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.185237 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.185269 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.185335 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.193750 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpz25\" (UniqueName: \"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.360651 4883 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.757497 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:25 crc kubenswrapper[4883]: W0310 09:24:25.771238 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13fc6b71_b633_4726_ad0d_91a04b592d3b.slice/crio-e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8 WatchSource:0}: Error finding container e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8: Status 404 returned error can't find the container with id e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8 Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.998785 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8"} Mar 10 09:24:26 crc kubenswrapper[4883]: I0310 09:24:26.002404 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"19096ebe-3796-4e22-a477-45d3e635a80a","Type":"ContainerStarted","Data":"c801de5a1fe88d8ceae751b1030db246ddaff2437d3b3776ac62681445fa6afb"} Mar 10 09:24:26 crc kubenswrapper[4883]: I0310 09:24:26.002636 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:26 crc kubenswrapper[4883]: I0310 09:24:26.091725 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" path="/var/lib/kubelet/pods/71a316ea-2390-4458-aacb-c7b7b52030dd/volumes" Mar 10 09:24:27 crc kubenswrapper[4883]: I0310 09:24:27.009983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e"} Mar 10 09:24:28 crc kubenswrapper[4883]: I0310 09:24:28.023179 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96"} Mar 10 09:24:28 crc kubenswrapper[4883]: I0310 09:24:28.023570 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830"} Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.451824 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.471898 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=5.471882313 podStartE2EDuration="5.471882313s" podCreationTimestamp="2026-03-10 09:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:26.024246495 +0000 UTC m=+1252.279144405" watchObservedRunningTime="2026-03-10 09:24:29.471882313 +0000 UTC m=+1255.726780202" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.889554 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"] Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.891062 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.894684 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.896248 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.900429 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.068334 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.069752 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.071964 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.086563 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.086597 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.086653 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.086684 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.106789 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.107776 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.110784 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.117541 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.150528 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189402 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189440 
4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189511 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189543 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189581 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189643 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189669 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.195533 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.210906 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.226974 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.238683 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.240990 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc 
kubenswrapper[4883]: I0310 09:24:30.248449 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.254836 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.272932 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292509 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292633 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292700 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc 
kubenswrapper[4883]: I0310 09:24:30.292751 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292768 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.299707 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.300908 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.324090 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.341520 4883 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.343130 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.347153 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.407044 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408268 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408433 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408493 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " 
pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408530 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408612 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408683 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.412011 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.427517 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.433548 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.449359 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.502525 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.504217 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510366 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510435 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510503 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510539 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510579 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510605 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510703 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510727 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.511133 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " 
pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.519572 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.521434 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.535047 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.541994 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.554737 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.595673 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617230 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617304 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617378 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617424 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617508 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " 
pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617585 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617622 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617735 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617778 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc 
kubenswrapper[4883]: I0310 09:24:30.620289 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.622852 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.623903 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.638342 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.664288 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720069 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720410 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720693 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 
crc kubenswrapper[4883]: I0310 09:24:30.721591 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.722207 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.723007 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.723411 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.723959 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.724649 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " 
pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.724733 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.743809 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.848553 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.025008 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.120122 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.136060 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb"} Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.136224 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.179716 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.214537578 podStartE2EDuration="6.179696798s" podCreationTimestamp="2026-03-10 09:24:25 +0000 UTC" firstStartedPulling="2026-03-10 09:24:25.775633696 +0000 UTC m=+1252.030531586" lastFinishedPulling="2026-03-10 09:24:29.740792917 +0000 UTC m=+1255.995690806" observedRunningTime="2026-03-10 09:24:31.162189826 +0000 UTC m=+1257.417087715" watchObservedRunningTime="2026-03-10 09:24:31.179696798 +0000 UTC m=+1257.434594688" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.196460 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.198848 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.200554 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.200977 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.219666 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.248794 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.248893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tvcd\" (UniqueName: 
\"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.248957 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.249102 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: W0310 09:24:31.258991 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47cb531f_d85b_41f0_9608_a19b158679c7.slice/crio-267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf WatchSource:0}: Error finding container 267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf: Status 404 returned error can't find the container with id 267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.259651 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.350873 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.351591 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tvcd\" (UniqueName: \"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.351856 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.352021 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.356602 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.357417 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.363117 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.367531 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tvcd\" (UniqueName: \"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.489816 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.528829 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: W0310 09:24:31.610878 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e8e02c2_3c36_440a_b7aa_d39b27f3bd32.slice/crio-3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b WatchSource:0}: Error finding container 3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b: Status 404 returned error can't find the container with id 3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.612952 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.628968 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"] Mar 10 09:24:31 crc kubenswrapper[4883]: W0310 09:24:31.642005 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabc326c8_0db0_4645_b1dc_3871b1b4202c.slice/crio-48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8 WatchSource:0}: Error finding container 48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8: Status 404 returned error can't find the container with id 48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8 Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.967975 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.149523 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32","Type":"ContainerStarted","Data":"3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 
09:24:32.152295 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rptkb" event={"ID":"3d3a7934-1ab2-4013-b3ff-90859ffcc179","Type":"ContainerStarted","Data":"1b941b684a31273444bcbcfa6618c7c4de8354efb411f75cdb5b7cfd833281c3"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.152356 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rptkb" event={"ID":"3d3a7934-1ab2-4013-b3ff-90859ffcc179","Type":"ContainerStarted","Data":"93bf15898cd55775fe1c641481b3b6079857f825b2b5abf0ebd10208dc8b1155"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.159529 4883 generic.go:334] "Generic (PLEG): container finished" podID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerID="419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8" exitCode=0 Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.159642 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerDied","Data":"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.159679 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerStarted","Data":"48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.166169 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s66f5" event={"ID":"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05","Type":"ContainerStarted","Data":"4666671669cee93f50494e5681010f34c95c13f1664290315b5bce6c7f0d081c"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.167308 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b36d321f-f1b6-425e-abdf-61478d9ccf1a","Type":"ContainerStarted","Data":"b5a77ffb6de872f70305d30c9087d34b77d39ad804e7a809e3b0b8a5a62a2dd7"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.172267 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerStarted","Data":"784b466f36c208110a836e6f45ef305de71d86ec8154db38b7751f2d884445c5"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.177095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerStarted","Data":"267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.178749 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-rptkb" podStartSLOduration=3.17873298 podStartE2EDuration="3.17873298s" podCreationTimestamp="2026-03-10 09:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:32.176931093 +0000 UTC m=+1258.431828981" watchObservedRunningTime="2026-03-10 09:24:32.17873298 +0000 UTC m=+1258.433630869" Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.194419 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-s66f5" podStartSLOduration=1.1943984300000001 podStartE2EDuration="1.19439843s" podCreationTimestamp="2026-03-10 09:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:32.193892335 +0000 UTC m=+1258.448790225" watchObservedRunningTime="2026-03-10 09:24:32.19439843 +0000 UTC m=+1258.449296318" Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.192794 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerStarted","Data":"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867"} Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.193293 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.198095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s66f5" event={"ID":"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05","Type":"ContainerStarted","Data":"9cbbbfbaa746758ec604b741e62dd7f69a1ec3688b2c70a13290cb0a5d5252e8"} Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.677097 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" podStartSLOduration=3.677078519 podStartE2EDuration="3.677078519s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:33.21520347 +0000 UTC m=+1259.470101359" watchObservedRunningTime="2026-03-10 09:24:33.677078519 +0000 UTC m=+1259.931976408" Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.679645 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.690650 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:34 crc kubenswrapper[4883]: I0310 09:24:34.210365 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32","Type":"ContainerStarted","Data":"fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25"} Mar 10 09:24:34 crc kubenswrapper[4883]: I0310 09:24:34.213801 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b36d321f-f1b6-425e-abdf-61478d9ccf1a","Type":"ContainerStarted","Data":"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"}
Mar 10 09:24:34 crc kubenswrapper[4883]: I0310 09:24:34.236106 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.504893738 podStartE2EDuration="4.236084343s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="2026-03-10 09:24:31.614137275 +0000 UTC m=+1257.869035164" lastFinishedPulling="2026-03-10 09:24:33.34532788 +0000 UTC m=+1259.600225769" observedRunningTime="2026-03-10 09:24:34.223676216 +0000 UTC m=+1260.478574095" watchObservedRunningTime="2026-03-10 09:24:34.236084343 +0000 UTC m=+1260.490982232"
Mar 10 09:24:34 crc kubenswrapper[4883]: I0310 09:24:34.260383 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.35230112 podStartE2EDuration="4.260367824s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="2026-03-10 09:24:31.087343182 +0000 UTC m=+1257.342241070" lastFinishedPulling="2026-03-10 09:24:32.995409884 +0000 UTC m=+1259.250307774" observedRunningTime="2026-03-10 09:24:34.240909562 +0000 UTC m=+1260.495807451" watchObservedRunningTime="2026-03-10 09:24:34.260367824 +0000 UTC m=+1260.515265712"
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.223044 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd" gracePeriod=30
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.408192 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.724341 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.843520 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.895912 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") pod \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") "
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.896018 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") pod \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") "
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.896170 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") pod \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") "
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.899516 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz" (OuterVolumeSpecName: "kube-api-access-mslfz") pod "b36d321f-f1b6-425e-abdf-61478d9ccf1a" (UID: "b36d321f-f1b6-425e-abdf-61478d9ccf1a"). InnerVolumeSpecName "kube-api-access-mslfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.916438 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data" (OuterVolumeSpecName: "config-data") pod "b36d321f-f1b6-425e-abdf-61478d9ccf1a" (UID: "b36d321f-f1b6-425e-abdf-61478d9ccf1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.917454 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b36d321f-f1b6-425e-abdf-61478d9ccf1a" (UID: "b36d321f-f1b6-425e-abdf-61478d9ccf1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.998313 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.998352 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.998368 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.236057 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerStarted","Data":"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.238248 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerStarted","Data":"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.239663 4883 generic.go:334] "Generic (PLEG): container finished" podID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" containerID="9cbbbfbaa746758ec604b741e62dd7f69a1ec3688b2c70a13290cb0a5d5252e8" exitCode=0
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.239824 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s66f5" event={"ID":"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05","Type":"ContainerDied","Data":"9cbbbfbaa746758ec604b741e62dd7f69a1ec3688b2c70a13290cb0a5d5252e8"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.243692 4883 generic.go:334] "Generic (PLEG): container finished" podID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerID="27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd" exitCode=0
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.243807 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b36d321f-f1b6-425e-abdf-61478d9ccf1a","Type":"ContainerDied","Data":"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.243840 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b36d321f-f1b6-425e-abdf-61478d9ccf1a","Type":"ContainerDied","Data":"b5a77ffb6de872f70305d30c9087d34b77d39ad804e7a809e3b0b8a5a62a2dd7"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.243862 4883 scope.go:117] "RemoveContainer" containerID="27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.244136 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.248672 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-log" containerID="cri-o://5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457" gracePeriod=30
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.248987 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerStarted","Data":"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.251008 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerStarted","Data":"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.249184 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-metadata" containerID="cri-o://ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8" gracePeriod=30
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.259494 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.198673662 podStartE2EDuration="6.259459359s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="2026-03-10 09:24:31.273172051 +0000 UTC m=+1257.528069940" lastFinishedPulling="2026-03-10 09:24:35.333957748 +0000 UTC m=+1261.588855637" observedRunningTime="2026-03-10 09:24:36.254461915 +0000 UTC m=+1262.509359804" watchObservedRunningTime="2026-03-10 09:24:36.259459359 +0000 UTC m=+1262.514357248"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.273241 4883 scope.go:117] "RemoveContainer" containerID="27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"
Mar 10 09:24:36 crc kubenswrapper[4883]: E0310 09:24:36.273673 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd\": container with ID starting with 27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd not found: ID does not exist" containerID="27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.273716 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"} err="failed to get container status \"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd\": rpc error: code = NotFound desc = could not find container \"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd\": container with ID starting with 27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd not found: ID does not exist"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.278245 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.289672 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.314774 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:36 crc kubenswrapper[4883]: E0310 09:24:36.315308 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.315330 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.315593 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.316329 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.317987 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.48274827 podStartE2EDuration="6.317968367s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="2026-03-10 09:24:31.492320711 +0000 UTC m=+1257.747218600" lastFinishedPulling="2026-03-10 09:24:35.327540818 +0000 UTC m=+1261.582438697" observedRunningTime="2026-03-10 09:24:36.293899601 +0000 UTC m=+1262.548797490" watchObservedRunningTime="2026-03-10 09:24:36.317968367 +0000 UTC m=+1262.572866257"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.318748 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.318838 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.318764 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.325351 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.405907 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.405978 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5kz\" (UniqueName: \"kubernetes.io/projected/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-kube-api-access-br5kz\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.406027 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.406193 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.406230 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507288 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507603 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br5kz\" (UniqueName: \"kubernetes.io/projected/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-kube-api-access-br5kz\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507653 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507706 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507730 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.513373 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.518034 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.518709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.522065 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.524706 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5kz\" (UniqueName: \"kubernetes.io/projected/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-kube-api-access-br5kz\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.631944 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.734160 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.829557 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") pod \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") "
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.829611 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") pod \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") "
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.829694 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") pod \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") "
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.829716 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") pod \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") "
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.830152 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs" (OuterVolumeSpecName: "logs") pod "d18d7e6b-feee-4222-a8fd-c13c0c70db2a" (UID: "d18d7e6b-feee-4222-a8fd-c13c0c70db2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.830765 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.833546 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn" (OuterVolumeSpecName: "kube-api-access-kldfn") pod "d18d7e6b-feee-4222-a8fd-c13c0c70db2a" (UID: "d18d7e6b-feee-4222-a8fd-c13c0c70db2a"). InnerVolumeSpecName "kube-api-access-kldfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.854159 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data" (OuterVolumeSpecName: "config-data") pod "d18d7e6b-feee-4222-a8fd-c13c0c70db2a" (UID: "d18d7e6b-feee-4222-a8fd-c13c0c70db2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.858253 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d18d7e6b-feee-4222-a8fd-c13c0c70db2a" (UID: "d18d7e6b-feee-4222-a8fd-c13c0c70db2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.933198 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.933232 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.933246 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.062967 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: W0310 09:24:37.063201 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c5d710c_62fb_4a8c_8a5c_ec6709017c75.slice/crio-b5748be1a30c2c0759c323ba2e951830ad159416d7d5aa62229441f92ae75abe WatchSource:0}: Error finding container b5748be1a30c2c0759c323ba2e951830ad159416d7d5aa62229441f92ae75abe: Status 404 returned error can't find the container with id b5748be1a30c2c0759c323ba2e951830ad159416d7d5aa62229441f92ae75abe
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258530 4883 generic.go:334] "Generic (PLEG): container finished" podID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8" exitCode=0
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258749 4883 generic.go:334] "Generic (PLEG): container finished" podID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457" exitCode=143
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258786 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerDied","Data":"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258810 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerDied","Data":"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258820 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerDied","Data":"784b466f36c208110a836e6f45ef305de71d86ec8154db38b7751f2d884445c5"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258834 4883 scope.go:117] "RemoveContainer" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258915 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.264042 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c5d710c-62fb-4a8c-8a5c-ec6709017c75","Type":"ContainerStarted","Data":"febf65f20b57000a3d173a143e3b6c0d43fb2e01df47333228b40fc340e619a9"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.264068 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c5d710c-62fb-4a8c-8a5c-ec6709017c75","Type":"ContainerStarted","Data":"b5748be1a30c2c0759c323ba2e951830ad159416d7d5aa62229441f92ae75abe"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.286466 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.2864539910000001 podStartE2EDuration="1.286453991s" podCreationTimestamp="2026-03-10 09:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:37.28084306 +0000 UTC m=+1263.535740949" watchObservedRunningTime="2026-03-10 09:24:37.286453991 +0000 UTC m=+1263.541351879"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.306613 4883 scope.go:117] "RemoveContainer" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.310549 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.330811 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.348604 4883 scope.go:117] "RemoveContainer" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"
Mar 10 09:24:37 crc kubenswrapper[4883]: E0310 09:24:37.353112 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": container with ID starting with ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8 not found: ID does not exist" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353151 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"} err="failed to get container status \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": rpc error: code = NotFound desc = could not find container \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": container with ID starting with ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8 not found: ID does not exist"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353182 4883 scope.go:117] "RemoveContainer" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353245 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: E0310 09:24:37.353731 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-log"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353750 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-log"
Mar 10 09:24:37 crc kubenswrapper[4883]: E0310 09:24:37.353762 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-metadata"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353771 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-metadata"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353969 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-metadata"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353996 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-log"
Mar 10 09:24:37 crc kubenswrapper[4883]: E0310 09:24:37.354776 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": container with ID starting with 5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457 not found: ID does not exist" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.354826 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"} err="failed to get container status \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": rpc error: code = NotFound desc = could not find container \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": container with ID starting with 5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457 not found: ID does not exist"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.354871 4883 scope.go:117] "RemoveContainer" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.355060 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.355175 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"} err="failed to get container status \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": rpc error: code = NotFound desc = could not find container \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": container with ID starting with ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8 not found: ID does not exist"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.355218 4883 scope.go:117] "RemoveContainer" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.355878 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"} err="failed to get container status \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": rpc error: code = NotFound desc = could not find container \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": container with ID starting with 5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457 not found: ID does not exist"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.358686 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.358936 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.363354 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.443877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.443940 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.444012 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.444092 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.444125 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546170 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546492 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546564 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546607 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546654 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.547158 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") "
pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.552706 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.557912 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.560412 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.564497 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.631915 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.768985 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tvcd\" (UniqueName: \"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") pod \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.769391 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") pod \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.769435 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") pod \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.769703 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") pod \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.774555 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd" (OuterVolumeSpecName: "kube-api-access-5tvcd") pod "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" (UID: "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05"). InnerVolumeSpecName "kube-api-access-5tvcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.781220 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.785102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts" (OuterVolumeSpecName: "scripts") pod "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" (UID: "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.790784 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" (UID: "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.792824 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data" (OuterVolumeSpecName: "config-data") pod "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" (UID: "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.873867 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tvcd\" (UniqueName: \"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.873904 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.873915 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.873925 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.091205 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" path="/var/lib/kubelet/pods/b36d321f-f1b6-425e-abdf-61478d9ccf1a/volumes" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.092027 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" path="/var/lib/kubelet/pods/d18d7e6b-feee-4222-a8fd-c13c0c70db2a/volumes" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.195504 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:38 crc kubenswrapper[4883]: W0310 09:24:38.201286 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2300d23_b25a_4e0d_a695_7c11709bfcda.slice/crio-ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca WatchSource:0}: Error finding container ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca: Status 404 returned error can't find the container with id ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.276139 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerStarted","Data":"ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca"} Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.278102 4883 generic.go:334] "Generic (PLEG): container finished" podID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" containerID="1b941b684a31273444bcbcfa6618c7c4de8354efb411f75cdb5b7cfd833281c3" exitCode=0 Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.278158 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rptkb" event={"ID":"3d3a7934-1ab2-4013-b3ff-90859ffcc179","Type":"ContainerDied","Data":"1b941b684a31273444bcbcfa6618c7c4de8354efb411f75cdb5b7cfd833281c3"} Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.280150 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s66f5" event={"ID":"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05","Type":"ContainerDied","Data":"4666671669cee93f50494e5681010f34c95c13f1664290315b5bce6c7f0d081c"} Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.280210 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4666671669cee93f50494e5681010f34c95c13f1664290315b5bce6c7f0d081c" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.280309 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.358097 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:24:38 crc kubenswrapper[4883]: E0310 09:24:38.358814 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" containerName="nova-cell1-conductor-db-sync" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.358836 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" containerName="nova-cell1-conductor-db-sync" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.359023 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" containerName="nova-cell1-conductor-db-sync" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.359732 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.361464 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.384646 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.493231 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.493491 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.493650 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8n9t\" (UniqueName: \"kubernetes.io/projected/90b06d82-9f07-4c29-9bad-987d2c6d027c-kube-api-access-t8n9t\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.595219 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8n9t\" (UniqueName: \"kubernetes.io/projected/90b06d82-9f07-4c29-9bad-987d2c6d027c-kube-api-access-t8n9t\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.595446 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.595576 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.600634 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.601076 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.609763 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8n9t\" (UniqueName: \"kubernetes.io/projected/90b06d82-9f07-4c29-9bad-987d2c6d027c-kube-api-access-t8n9t\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.779888 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.185192 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:24:39 crc kubenswrapper[4883]: W0310 09:24:39.187848 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90b06d82_9f07_4c29_9bad_987d2c6d027c.slice/crio-eaf7cf2d83f4edaa00f3dd015a77894e6e02ae1414bba017094d3564d5386f66 WatchSource:0}: Error finding container eaf7cf2d83f4edaa00f3dd015a77894e6e02ae1414bba017094d3564d5386f66: Status 404 returned error can't find the container with id eaf7cf2d83f4edaa00f3dd015a77894e6e02ae1414bba017094d3564d5386f66 Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.293999 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"90b06d82-9f07-4c29-9bad-987d2c6d027c","Type":"ContainerStarted","Data":"eaf7cf2d83f4edaa00f3dd015a77894e6e02ae1414bba017094d3564d5386f66"} Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.296802 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerStarted","Data":"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"} Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.296867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerStarted","Data":"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"} Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.327662 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.327647916 podStartE2EDuration="2.327647916s" podCreationTimestamp="2026-03-10 09:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:39.317387498 +0000 UTC m=+1265.572285387" watchObservedRunningTime="2026-03-10 09:24:39.327647916 +0000 UTC m=+1265.582545806" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.579559 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.726578 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") pod \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.726632 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") pod \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.727326 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") pod \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.727387 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") pod \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.733283 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2" (OuterVolumeSpecName: "kube-api-access-hbkj2") pod "3d3a7934-1ab2-4013-b3ff-90859ffcc179" (UID: "3d3a7934-1ab2-4013-b3ff-90859ffcc179"). InnerVolumeSpecName "kube-api-access-hbkj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.733649 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts" (OuterVolumeSpecName: "scripts") pod "3d3a7934-1ab2-4013-b3ff-90859ffcc179" (UID: "3d3a7934-1ab2-4013-b3ff-90859ffcc179"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.753236 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d3a7934-1ab2-4013-b3ff-90859ffcc179" (UID: "3d3a7934-1ab2-4013-b3ff-90859ffcc179"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.753670 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data" (OuterVolumeSpecName: "config-data") pod "3d3a7934-1ab2-4013-b3ff-90859ffcc179" (UID: "3d3a7934-1ab2-4013-b3ff-90859ffcc179"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.829312 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.829335 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.829345 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.829389 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.313557 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.313528 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rptkb" event={"ID":"3d3a7934-1ab2-4013-b3ff-90859ffcc179","Type":"ContainerDied","Data":"93bf15898cd55775fe1c641481b3b6079857f825b2b5abf0ebd10208dc8b1155"} Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.314103 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93bf15898cd55775fe1c641481b3b6079857f825b2b5abf0ebd10208dc8b1155" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.318991 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"90b06d82-9f07-4c29-9bad-987d2c6d027c","Type":"ContainerStarted","Data":"1c550759d0b078fae7cdd0a6a35ebfbeea86ecb02fac08bf2b4feb61ea95f4b9"} Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.320168 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.335426 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.33540928 podStartE2EDuration="2.33540928s" podCreationTimestamp="2026-03-10 09:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:40.332605022 +0000 UTC m=+1266.587502911" watchObservedRunningTime="2026-03-10 09:24:40.33540928 +0000 UTC m=+1266.590307169" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.478983 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.479290 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" 
containerName="nova-api-log" containerID="cri-o://fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e" gracePeriod=30 Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.479998 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-api" containerID="cri-o://cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2" gracePeriod=30 Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.495230 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.495653 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerName="nova-scheduler-scheduler" containerID="cri-o://fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25" gracePeriod=30 Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.520806 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.859615 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.981011 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"] Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.990807 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="dnsmasq-dns" containerID="cri-o://cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1" gracePeriod=10 Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.182611 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333494 4883 generic.go:334] "Generic (PLEG): container finished" podID="47cb531f-d85b-41f0-9608-a19b158679c7" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2" exitCode=0
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333553 4883 generic.go:334] "Generic (PLEG): container finished" podID="47cb531f-d85b-41f0-9608-a19b158679c7" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e" exitCode=143
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333613 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerDied","Data":"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"}
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333667 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerDied","Data":"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"}
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333679 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerDied","Data":"267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf"}
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333703 4883 scope.go:117] "RemoveContainer" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333911 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.337340 4883 generic.go:334] "Generic (PLEG): container finished" podID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerID="cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1" exitCode=0
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.337624 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerDied","Data":"cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1"}
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.337913 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-log" containerID="cri-o://7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff" gracePeriod=30
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.338104 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-metadata" containerID="cri-o://e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af" gracePeriod=30
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.370187 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") pod \"47cb531f-d85b-41f0-9608-a19b158679c7\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.370378 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") pod \"47cb531f-d85b-41f0-9608-a19b158679c7\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.370490 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") pod \"47cb531f-d85b-41f0-9608-a19b158679c7\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.370576 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") pod \"47cb531f-d85b-41f0-9608-a19b158679c7\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.371574 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs" (OuterVolumeSpecName: "logs") pod "47cb531f-d85b-41f0-9608-a19b158679c7" (UID: "47cb531f-d85b-41f0-9608-a19b158679c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.378563 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q" (OuterVolumeSpecName: "kube-api-access-dg92q") pod "47cb531f-d85b-41f0-9608-a19b158679c7" (UID: "47cb531f-d85b-41f0-9608-a19b158679c7"). InnerVolumeSpecName "kube-api-access-dg92q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.384772 4883 scope.go:117] "RemoveContainer" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.397733 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data" (OuterVolumeSpecName: "config-data") pod "47cb531f-d85b-41f0-9608-a19b158679c7" (UID: "47cb531f-d85b-41f0-9608-a19b158679c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.400984 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47cb531f-d85b-41f0-9608-a19b158679c7" (UID: "47cb531f-d85b-41f0-9608-a19b158679c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.416713 4883 scope.go:117] "RemoveContainer" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"
Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.417293 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": container with ID starting with cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2 not found: ID does not exist" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.417336 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"} err="failed to get container status \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": rpc error: code = NotFound desc = could not find container \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": container with ID starting with cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2 not found: ID does not exist"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.417364 4883 scope.go:117] "RemoveContainer" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"
Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.420839 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": container with ID starting with fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e not found: ID does not exist" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.420885 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"} err="failed to get container status \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": rpc error: code = NotFound desc = could not find container \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": container with ID starting with fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e not found: ID does not exist"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.420915 4883 scope.go:117] "RemoveContainer" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.421230 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"} err="failed to get container status \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": rpc error: code = NotFound desc = could not find container \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": container with ID starting with cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2 not found: ID does not exist"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.421247 4883 scope.go:117] "RemoveContainer" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.421509 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"} err="failed to get container status \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": rpc error: code = NotFound desc = could not find container \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": container with ID starting with fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e not found: ID does not exist"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.440096 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.476791 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.476889 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.477010 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.477023 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.579325 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.579799 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.579850 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.580053 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.580126 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.580228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.584253 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc" (OuterVolumeSpecName: "kube-api-access-9h2jc") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "kube-api-access-9h2jc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.621355 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.628649 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.632192 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.633738 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.637637 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config" (OuterVolumeSpecName: "config") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.645462 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684465 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684508 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684521 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684532 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684544 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684552 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.723564 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.728104 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.752833 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753292 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-log"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753308 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-log"
Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753318 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" containerName="nova-manage"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753325 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" containerName="nova-manage"
Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753338 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="dnsmasq-dns"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753343 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="dnsmasq-dns"
Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753365 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="init"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753373 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="init"
Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753389 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-api"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753395 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-api"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753609 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" containerName="nova-manage"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753635 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-api"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753643 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="dnsmasq-dns"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753663 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-log"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.754650 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.759907 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.761102 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.878506 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.888392 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.888714 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.889042 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.889742 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.991754 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.991808 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.991835 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.991937 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992053 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") "
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992263 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs" (OuterVolumeSpecName: "logs") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992511 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992574 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992661 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992734 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992798 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.993391 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.997882 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.998848 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.002548 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88" (OuterVolumeSpecName: "kube-api-access-f9f88") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "kube-api-access-f9f88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.010908 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.020163 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.024431 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data" (OuterVolumeSpecName: "config-data") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.050646 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.077712 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.095283 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.095316 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.095330 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.095341 4883 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.109754 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" path="/var/lib/kubelet/pods/47cb531f-d85b-41f0-9608-a19b158679c7/volumes"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.352786 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af" exitCode=0
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.353071 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff" exitCode=143
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.352877 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.352905 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerDied","Data":"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"}
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.353224 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerDied","Data":"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"}
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.353259 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerDied","Data":"ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca"}
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.353279 4883 scope.go:117] "RemoveContainer" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.361538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerDied","Data":"f3dfd9c8abe53e2f4e70fd004e6457ef025d9a2c819617d0dfac05e54db79843"}
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.361720 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.382447 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.391794 4883 scope.go:117] "RemoveContainer" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.419374 4883 scope.go:117] "RemoveContainer" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"
Mar 10 09:24:42 crc kubenswrapper[4883]: E0310 09:24:42.420467 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": container with ID starting with e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af not found: ID does not exist" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.420556 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"} err="failed to get container status \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": rpc error: code = NotFound desc = could not find container \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": container with ID starting with e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af not found: ID does not exist"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.420591 4883 scope.go:117] "RemoveContainer" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"
Mar 10 09:24:42 crc kubenswrapper[4883]: E0310 09:24:42.421824 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": container with ID starting with 7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff not found: ID does not exist" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.421851 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"} err="failed to get container status \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": rpc error: code = NotFound desc = could not find container \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": container with ID starting with 7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff not found: ID does not exist"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.421868 4883 scope.go:117] "RemoveContainer" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.422168 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"} err="failed to get container status \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": rpc error: code = NotFound desc = could not find container \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": container with ID starting with e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af not found: ID does not exist"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.422184 4883 scope.go:117] "RemoveContainer" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.422437 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"} err="failed to get container status \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": rpc error: code = NotFound desc = could not find container \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": container with ID starting with 7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff not found: ID does not exist"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.422469 4883 scope.go:117] "RemoveContainer" containerID="cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.427280 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.445164 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:42 crc kubenswrapper[4883]: E0310 09:24:42.445872 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-log"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.445891 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-log"
Mar 10 09:24:42 crc kubenswrapper[4883]: E0310 09:24:42.446127 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-metadata"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.446139 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-metadata"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.446460 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-metadata"
Mar 10 09:24:42 crc
kubenswrapper[4883]: I0310 09:24:42.446536 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-log" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.448529 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.450836 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.454053 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.455609 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"] Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.455977 4883 scope.go:117] "RemoveContainer" containerID="8dc17cfa98b4f413422ff6ec7b4debd0ca6ed29db8f51bb73e604fc0c8aedd72" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.466166 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"] Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.471923 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.487572 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:42 crc kubenswrapper[4883]: W0310 09:24:42.489713 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf34d69a2_fd0d_42e4_942f_178dbf2c1b55.slice/crio-b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62 WatchSource:0}: Error finding container b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62: Status 404 returned error can't find the container with id 
b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62 Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609237 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609495 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609535 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609578 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc 
kubenswrapper[4883]: I0310 09:24:42.712198 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.712261 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.712300 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.712323 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.712353 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.713394 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.716612 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.717389 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.717828 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.729005 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.766414 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.378209 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerStarted","Data":"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8"} Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.378501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerStarted","Data":"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a"} Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.378514 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerStarted","Data":"b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62"} Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.407297 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.407278218 podStartE2EDuration="2.407278218s" podCreationTimestamp="2026-03-10 09:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:43.394250541 +0000 UTC m=+1269.649148420" watchObservedRunningTime="2026-03-10 09:24:43.407278218 +0000 UTC m=+1269.662176106" Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.791864 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:43 crc kubenswrapper[4883]: W0310 09:24:43.796581 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod398e71db_8c97_477b_b92c_35829f9b7dee.slice/crio-f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888 WatchSource:0}: Error finding 
container f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888: Status 404 returned error can't find the container with id f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888 Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.104926 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" path="/var/lib/kubelet/pods/c2300d23-b25a-4e0d-a695-7c11709bfcda/volumes" Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.114245 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" path="/var/lib/kubelet/pods/fb652436-cf46-4a91-b358-f6c6a011cf43/volumes" Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.391542 4883 generic.go:334] "Generic (PLEG): container finished" podID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerID="fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25" exitCode=0 Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.391640 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32","Type":"ContainerDied","Data":"fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25"} Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.399719 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerStarted","Data":"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f"} Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.399768 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerStarted","Data":"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98"} Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.399784 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerStarted","Data":"f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888"} Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.428277 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.428255633 podStartE2EDuration="2.428255633s" podCreationTimestamp="2026-03-10 09:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:44.42183182 +0000 UTC m=+1270.676729729" watchObservedRunningTime="2026-03-10 09:24:44.428255633 +0000 UTC m=+1270.683153521" Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.563333 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.660348 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") pod \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.660439 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") pod \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.660601 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") pod \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " Mar 10 
09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.665561 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2" (OuterVolumeSpecName: "kube-api-access-qcsm2") pod "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" (UID: "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32"). InnerVolumeSpecName "kube-api-access-qcsm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.683796 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data" (OuterVolumeSpecName: "config-data") pod "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" (UID: "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.684354 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" (UID: "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.763845 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.763874 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.763890 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.412223 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32","Type":"ContainerDied","Data":"3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b"} Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.412312 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.412355 4883 scope.go:117] "RemoveContainer" containerID="fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.456309 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.466126 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.473519 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:45 crc kubenswrapper[4883]: E0310 09:24:45.474163 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerName="nova-scheduler-scheduler" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.474227 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerName="nova-scheduler-scheduler" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.474492 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerName="nova-scheduler-scheduler" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.475437 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.479435 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.485961 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.578565 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.578743 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.578910 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.680439 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.680515 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.680613 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.686277 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.686814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.697466 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.796274 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.091328 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" path="/var/lib/kubelet/pods/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32/volumes" Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.197599 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:46 crc kubenswrapper[4883]: W0310 09:24:46.198062 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36ece3b9_2a8b_4cfd_b78c_09adc594ac3b.slice/crio-e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da WatchSource:0}: Error finding container e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da: Status 404 returned error can't find the container with id e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.426219 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b","Type":"ContainerStarted","Data":"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5"} Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.427008 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b","Type":"ContainerStarted","Data":"e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da"} Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.449162 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.449138649 podStartE2EDuration="1.449138649s" podCreationTimestamp="2026-03-10 09:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 09:24:46.443393435 +0000 UTC m=+1272.698291325" watchObservedRunningTime="2026-03-10 09:24:46.449138649 +0000 UTC m=+1272.704036537" Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.633196 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.655365 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:47 crc kubenswrapper[4883]: I0310 09:24:47.459738 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:47 crc kubenswrapper[4883]: I0310 09:24:47.767645 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 09:24:47 crc kubenswrapper[4883]: I0310 09:24:47.767734 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 09:24:48 crc kubenswrapper[4883]: I0310 09:24:48.807226 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.219679 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"] Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.221276 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.223229 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.223490 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.237973 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"]
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.370554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.370919 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.371166 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.371558 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.473810 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.473893 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.474055 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.474112 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.481345 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.482229 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.482611 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.487502 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.540380 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.948354 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"]
Mar 10 09:24:49 crc kubenswrapper[4883]: W0310 09:24:49.950404 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf096652_ae85_4c98_8821_cd47eafae98f.slice/crio-b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56 WatchSource:0}: Error finding container b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56: Status 404 returned error can't find the container with id b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56
Mar 10 09:24:50 crc kubenswrapper[4883]: I0310 09:24:50.487315 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j9wml" event={"ID":"cf096652-ae85-4c98-8821-cd47eafae98f","Type":"ContainerStarted","Data":"354a3c28cf873c296da4850a04f59db6b94dbed1b8ba2c5447e9a4525bd40aed"}
Mar 10 09:24:50 crc kubenswrapper[4883]: I0310 09:24:50.487692 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j9wml" event={"ID":"cf096652-ae85-4c98-8821-cd47eafae98f","Type":"ContainerStarted","Data":"b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56"}
Mar 10 09:24:50 crc kubenswrapper[4883]: I0310 09:24:50.517280 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-j9wml" podStartSLOduration=1.5172581539999999 podStartE2EDuration="1.517258154s" podCreationTimestamp="2026-03-10 09:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:50.505093076 +0000 UTC m=+1276.759990965" watchObservedRunningTime="2026-03-10 09:24:50.517258154 +0000 UTC m=+1276.772156044"
Mar 10 09:24:50 crc kubenswrapper[4883]: I0310 09:24:50.796854 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 10 09:24:52 crc kubenswrapper[4883]: I0310 09:24:52.078709 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 09:24:52 crc kubenswrapper[4883]: I0310 09:24:52.079140 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 09:24:52 crc kubenswrapper[4883]: I0310 09:24:52.767883 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 10 09:24:52 crc kubenswrapper[4883]: I0310 09:24:52.767941 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 10 09:24:53 crc kubenswrapper[4883]: I0310 09:24:53.119688 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 09:24:53 crc kubenswrapper[4883]: I0310 09:24:53.161329 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 09:24:53 crc kubenswrapper[4883]: I0310 09:24:53.782699 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 10 09:24:53 crc kubenswrapper[4883]: I0310 09:24:53.782705 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 09:24:54 crc kubenswrapper[4883]: I0310 09:24:54.525395 4883 generic.go:334] "Generic (PLEG): container finished" podID="cf096652-ae85-4c98-8821-cd47eafae98f" containerID="354a3c28cf873c296da4850a04f59db6b94dbed1b8ba2c5447e9a4525bd40aed" exitCode=0
Mar 10 09:24:54 crc kubenswrapper[4883]: I0310 09:24:54.525486 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j9wml" event={"ID":"cf096652-ae85-4c98-8821-cd47eafae98f","Type":"ContainerDied","Data":"354a3c28cf873c296da4850a04f59db6b94dbed1b8ba2c5447e9a4525bd40aed"}
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.365285 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.797269 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.819904 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.832216 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.901759 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") pod \"cf096652-ae85-4c98-8821-cd47eafae98f\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") "
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.901896 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") pod \"cf096652-ae85-4c98-8821-cd47eafae98f\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") "
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.901979 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") pod \"cf096652-ae85-4c98-8821-cd47eafae98f\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") "
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.902042 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") pod \"cf096652-ae85-4c98-8821-cd47eafae98f\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") "
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.908966 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2" (OuterVolumeSpecName: "kube-api-access-t9dg2") pod "cf096652-ae85-4c98-8821-cd47eafae98f" (UID: "cf096652-ae85-4c98-8821-cd47eafae98f"). InnerVolumeSpecName "kube-api-access-t9dg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.911607 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts" (OuterVolumeSpecName: "scripts") pod "cf096652-ae85-4c98-8821-cd47eafae98f" (UID: "cf096652-ae85-4c98-8821-cd47eafae98f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.928141 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf096652-ae85-4c98-8821-cd47eafae98f" (UID: "cf096652-ae85-4c98-8821-cd47eafae98f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.932712 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data" (OuterVolumeSpecName: "config-data") pod "cf096652-ae85-4c98-8821-cd47eafae98f" (UID: "cf096652-ae85-4c98-8821-cd47eafae98f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.003063 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.003089 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.003102 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.003111 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.548640 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j9wml" event={"ID":"cf096652-ae85-4c98-8821-cd47eafae98f","Type":"ContainerDied","Data":"b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56"}
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.548990 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56"
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.548679 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.583034 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.720489 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.720790 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" containerID="cri-o://162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" gracePeriod=30
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.720875 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" containerID="cri-o://9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" gracePeriod=30
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.745423 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.745666 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" containerID="cri-o://d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" gracePeriod=30
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.745780 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" containerID="cri-o://903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" gracePeriod=30
Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.991969 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:24:57 crc kubenswrapper[4883]: I0310 09:24:57.558073 4883 generic.go:334] "Generic (PLEG): container finished" podID="398e71db-8c97-477b-b92c-35829f9b7dee" containerID="d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" exitCode=143
Mar 10 09:24:57 crc kubenswrapper[4883]: I0310 09:24:57.558143 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerDied","Data":"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98"}
Mar 10 09:24:57 crc kubenswrapper[4883]: I0310 09:24:57.560124 4883 generic.go:334] "Generic (PLEG): container finished" podID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerID="162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" exitCode=143
Mar 10 09:24:57 crc kubenswrapper[4883]: I0310 09:24:57.560213 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerDied","Data":"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a"}
Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.406202 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.406442 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" containerName="kube-state-metrics" containerID="cri-o://c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b" gracePeriod=30
Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.571595 4883 generic.go:334] "Generic (PLEG): container finished" podID="5094e588-6ef7-4214-a96e-26d75ad98977" containerID="c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b" exitCode=2
Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.571690 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5094e588-6ef7-4214-a96e-26d75ad98977","Type":"ContainerDied","Data":"c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b"}
Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.571859 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler" containerID="cri-o://6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" gracePeriod=30
Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.823351 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.856772 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k86gr\" (UniqueName: \"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") pod \"5094e588-6ef7-4214-a96e-26d75ad98977\" (UID: \"5094e588-6ef7-4214-a96e-26d75ad98977\") "
Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.877163 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr" (OuterVolumeSpecName: "kube-api-access-k86gr") pod "5094e588-6ef7-4214-a96e-26d75ad98977" (UID: "5094e588-6ef7-4214-a96e-26d75ad98977"). InnerVolumeSpecName "kube-api-access-k86gr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.959427 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k86gr\" (UniqueName: \"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.586326 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5094e588-6ef7-4214-a96e-26d75ad98977","Type":"ContainerDied","Data":"8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd"}
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.586408 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.586416 4883 scope.go:117] "RemoveContainer" containerID="c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.644857 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.659325 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.662046 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 09:24:59 crc kubenswrapper[4883]: E0310 09:24:59.662961 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" containerName="kube-state-metrics"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.662986 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" containerName="kube-state-metrics"
Mar 10 09:24:59 crc kubenswrapper[4883]: E0310 09:24:59.663013 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf096652-ae85-4c98-8821-cd47eafae98f" containerName="nova-manage"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.663021 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf096652-ae85-4c98-8821-cd47eafae98f" containerName="nova-manage"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.663196 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" containerName="kube-state-metrics"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.663218 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf096652-ae85-4c98-8821-cd47eafae98f" containerName="nova-manage"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.663921 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.669885 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.670253 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.674100 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.777007 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.777054 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.777132 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.777224 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7c5\" (UniqueName: \"kubernetes.io/projected/39c373dd-952a-4305-82ed-1d047c7a859f-kube-api-access-rp7c5\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.879808 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.879911 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7c5\" (UniqueName: \"kubernetes.io/projected/39c373dd-952a-4305-82ed-1d047c7a859f-kube-api-access-rp7c5\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.879983 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.880014 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.886228 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.886288 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.886972 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.895026 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7c5\" (UniqueName: \"kubernetes.io/projected/39c373dd-952a-4305-82ed-1d047c7a859f-kube-api-access-rp7c5\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0"
Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.991468 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.079794 4883 scope.go:117] "RemoveContainer" containerID="e20d3f6d5f3aae231c536075cd1098cf482fcd5c0cc1095b975e4d04ba285b0b"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.099301 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" path="/var/lib/kubelet/pods/5094e588-6ef7-4214-a96e-26d75ad98977/volumes"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.147313 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.147616 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-central-agent" containerID="cri-o://1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e" gracePeriod=30
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.148074 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="proxy-httpd" containerID="cri-o://ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb" gracePeriod=30
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.148136 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="sg-core" containerID="cri-o://3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96" gracePeriod=30
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.148181 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-notification-agent" containerID="cri-o://aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830" gracePeriod=30
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.254224 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.328807 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.388950 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") pod \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") "
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.389052 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") pod \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") "
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.389170 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") pod \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") "
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.389198 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") pod \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") "
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.389605 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs" (OuterVolumeSpecName: "logs") pod "f34d69a2-fd0d-42e4-942f-178dbf2c1b55" (UID: "f34d69a2-fd0d-42e4-942f-178dbf2c1b55"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.391761 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.396099 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5" (OuterVolumeSpecName: "kube-api-access-rfvw5") pod "f34d69a2-fd0d-42e4-942f-178dbf2c1b55" (UID: "f34d69a2-fd0d-42e4-942f-178dbf2c1b55"). InnerVolumeSpecName "kube-api-access-rfvw5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.414133 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f34d69a2-fd0d-42e4-942f-178dbf2c1b55" (UID: "f34d69a2-fd0d-42e4-942f-178dbf2c1b55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.431866 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data" (OuterVolumeSpecName: "config-data") pod "f34d69a2-fd0d-42e4-942f-178dbf2c1b55" (UID: "f34d69a2-fd0d-42e4-942f-178dbf2c1b55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:00 crc kubenswrapper[4883]: W0310 09:25:00.479229 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39c373dd_952a_4305_82ed_1d047c7a859f.slice/crio-501a3f53c407e29c3c2fa4c2789e88141b465b69eb039eedcd5031106e79d6d9 WatchSource:0}: Error finding container 501a3f53c407e29c3c2fa4c2789e88141b465b69eb039eedcd5031106e79d6d9: Status 404 returned error can't find the container with id 501a3f53c407e29c3c2fa4c2789e88141b465b69eb039eedcd5031106e79d6d9
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.481532 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.493966 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") "
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.494247 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") "
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.494653 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") "
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.494692 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") "
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.494748 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") "
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.495613 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.495661 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.495674 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.496147 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs" (OuterVolumeSpecName: "logs") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.502404 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s" (OuterVolumeSpecName: "kube-api-access-j6k6s") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "kube-api-access-j6k6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.516087 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.521803 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data" (OuterVolumeSpecName: "config-data") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.534882 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598876 4883 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598913 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598930 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598944 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598955 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600137 4883 generic.go:334] "Generic (PLEG): container finished" podID="398e71db-8c97-477b-b92c-35829f9b7dee" containerID="903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" exitCode=0 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600215 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600238 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerDied","Data":"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600317 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerDied","Data":"f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600362 4883 scope.go:117] "RemoveContainer" containerID="903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.602598 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"39c373dd-952a-4305-82ed-1d047c7a859f","Type":"ContainerStarted","Data":"501a3f53c407e29c3c2fa4c2789e88141b465b69eb039eedcd5031106e79d6d9"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.605121 4883 generic.go:334] "Generic (PLEG): container finished" podID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerID="9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" exitCode=0 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.605227 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.605235 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerDied","Data":"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.605301 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerDied","Data":"b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.610978 4883 generic.go:334] "Generic (PLEG): container finished" podID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerID="ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb" exitCode=0 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611076 4883 generic.go:334] "Generic (PLEG): container finished" podID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerID="3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96" exitCode=2 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611086 4883 generic.go:334] "Generic (PLEG): container finished" podID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerID="1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e" exitCode=0 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611110 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611145 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96"} 
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611159 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.645117 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.659363 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.673904 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.683360 4883 scope.go:117] "RemoveContainer" containerID="d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686360 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.686831 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686851 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.686873 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686879 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.686908 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686915 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.686932 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686938 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.687106 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.687126 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.687138 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.687149 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.688105 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.690492 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.708317 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.725576 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.725752 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-config-data\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.725778 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.727506 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-logs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.727585 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc5zn\" (UniqueName: \"kubernetes.io/projected/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-kube-api-access-kc5zn\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.729059 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.747607 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.749865 4883 scope.go:117] "RemoveContainer" containerID="903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.750265 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f\": container with ID starting with 903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f not found: ID does not exist" containerID="903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.750310 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f"} err="failed to get container status \"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f\": rpc error: code = NotFound desc = could not find container \"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f\": container with ID starting with 903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f not found: ID does not exist" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.750336 4883 scope.go:117] "RemoveContainer" 
containerID="d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.750850 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98\": container with ID starting with d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98 not found: ID does not exist" containerID="d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.750881 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98"} err="failed to get container status \"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98\": rpc error: code = NotFound desc = could not find container \"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98\": container with ID starting with d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98 not found: ID does not exist" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.750905 4883 scope.go:117] "RemoveContainer" containerID="9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.758537 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.760377 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.762217 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.766508 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.773118 4883 scope.go:117] "RemoveContainer" containerID="162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.789230 4883 scope.go:117] "RemoveContainer" containerID="9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.790235 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8\": container with ID starting with 9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8 not found: ID does not exist" containerID="9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.790283 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8"} err="failed to get container status \"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8\": rpc error: code = NotFound desc = could not find container \"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8\": container with ID starting with 9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8 not found: ID does not exist" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.790314 4883 scope.go:117] "RemoveContainer" containerID="162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" Mar 10 09:25:00 crc kubenswrapper[4883]: 
E0310 09:25:00.790686 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a\": container with ID starting with 162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a not found: ID does not exist" containerID="162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.790720 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a"} err="failed to get container status \"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a\": rpc error: code = NotFound desc = could not find container \"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a\": container with ID starting with 162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a not found: ID does not exist" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.798236 4883 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.799941 4883 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.801306 4883 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.801344 4883 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828526 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828571 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-config-data\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828642 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828711 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " 
pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828789 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828831 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-logs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828898 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc5zn\" (UniqueName: \"kubernetes.io/projected/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-kube-api-access-kc5zn\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.829127 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.829302 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-logs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.829407 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.832122 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-config-data\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.833943 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.834393 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.847909 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc5zn\" (UniqueName: \"kubernetes.io/projected/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-kube-api-access-kc5zn\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.930322 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") pod \"nova-api-0\" (UID: 
\"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.930430 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.930502 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.930554 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.931369 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.934448 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.935771 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0"
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.944892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0"
Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.033081 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.074198 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 09:25:01 crc kubenswrapper[4883]: W0310 09:25:01.503838 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0743bd84_b1d5_4634_9a7f_2c9daf2a5994.slice/crio-b89a274b6b99b40e5c7418d9be1c288fba7e8c8f8b645b4ae771621c50a66315 WatchSource:0}: Error finding container b89a274b6b99b40e5c7418d9be1c288fba7e8c8f8b645b4ae771621c50a66315: Status 404 returned error can't find the container with id b89a274b6b99b40e5c7418d9be1c288fba7e8c8f8b645b4ae771621c50a66315
Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.505376 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.555890 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 09:25:01 crc kubenswrapper[4883]: W0310 09:25:01.562783 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f69269f_4be5_4302_b2ad_8f38012ef305.slice/crio-f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0 WatchSource:0}: Error finding container f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0: Status 404 returned error can't find the container with id f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0
Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.624062 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"39c373dd-952a-4305-82ed-1d047c7a859f","Type":"ContainerStarted","Data":"2c87402ad2966ba02b67f901907c0989443a467ee915fa75f74aa0bd8d1b8283"}
Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.624517 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.627710 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0743bd84-b1d5-4634-9a7f-2c9daf2a5994","Type":"ContainerStarted","Data":"b89a274b6b99b40e5c7418d9be1c288fba7e8c8f8b645b4ae771621c50a66315"}
Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.631613 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerStarted","Data":"f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0"}
Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.652009 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.354639689 podStartE2EDuration="2.651991424s" podCreationTimestamp="2026-03-10 09:24:59 +0000 UTC" firstStartedPulling="2026-03-10 09:25:00.482299674 +0000 UTC m=+1286.737197563" lastFinishedPulling="2026-03-10 09:25:00.779651408 +0000 UTC m=+1287.034549298" observedRunningTime="2026-03-10 09:25:01.640820079 +0000 UTC m=+1287.895717968" watchObservedRunningTime="2026-03-10 09:25:01.651991424 +0000 UTC m=+1287.906889314"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.089448 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" path="/var/lib/kubelet/pods/398e71db-8c97-477b-b92c-35829f9b7dee/volumes"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.090069 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" path="/var/lib/kubelet/pods/f34d69a2-fd0d-42e4-942f-178dbf2c1b55/volumes"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.283520 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.463270 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") pod \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") "
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.463334 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") pod \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") "
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.463374 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") pod \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") "
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.469351 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78" (OuterVolumeSpecName: "kube-api-access-xvl78") pod "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" (UID: "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b"). InnerVolumeSpecName "kube-api-access-xvl78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.493809 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" (UID: "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.494906 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data" (OuterVolumeSpecName: "config-data") pod "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" (UID: "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.565927 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.565957 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.565967 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.650912 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerStarted","Data":"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367"}
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.650982 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerStarted","Data":"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6"}
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654177 4883 generic.go:334] "Generic (PLEG): container finished" podID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" exitCode=0
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654232 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654230 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b","Type":"ContainerDied","Data":"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5"}
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654373 4883 scope.go:117] "RemoveContainer" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654587 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b","Type":"ContainerDied","Data":"e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da"}
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.656643 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0743bd84-b1d5-4634-9a7f-2c9daf2a5994","Type":"ContainerStarted","Data":"5d21d33e3fa85744f491a32aabda12b4855841d4a5e55770849f877479307046"}
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.656680 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0743bd84-b1d5-4634-9a7f-2c9daf2a5994","Type":"ContainerStarted","Data":"e290a0fe114ae46c4e34a57982c5ccdd3bc36a159ca8d0ca9f2bee7d317ec4a7"}
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.677358 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.677342776 podStartE2EDuration="2.677342776s" podCreationTimestamp="2026-03-10 09:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:02.672763332 +0000 UTC m=+1288.927661221" watchObservedRunningTime="2026-03-10 09:25:02.677342776 +0000 UTC m=+1288.932240666"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.680919 4883 scope.go:117] "RemoveContainer" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5"
Mar 10 09:25:02 crc kubenswrapper[4883]: E0310 09:25:02.681328 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5\": container with ID starting with 6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5 not found: ID does not exist" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.681408 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5"} err="failed to get container status \"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5\": rpc error: code = NotFound desc = could not find container \"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5\": container with ID starting with 6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5 not found: ID does not exist"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.696022 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.696007272 podStartE2EDuration="2.696007272s" podCreationTimestamp="2026-03-10 09:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:02.690498183 +0000 UTC m=+1288.945396073" watchObservedRunningTime="2026-03-10 09:25:02.696007272 +0000 UTC m=+1288.950905161"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.712298 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.722470 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.730135 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:25:02 crc kubenswrapper[4883]: E0310 09:25:02.730662 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.730687 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.730919 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.731649 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.733544 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.738938 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.872070 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.872158 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265bk\" (UniqueName: \"kubernetes.io/projected/626b3115-ced1-45ea-8401-e2bd7e79a20c-kube-api-access-265bk\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.872202 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-config-data\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.973661 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.974019 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-265bk\" (UniqueName: \"kubernetes.io/projected/626b3115-ced1-45ea-8401-e2bd7e79a20c-kube-api-access-265bk\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.974063 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-config-data\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.978567 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.979671 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-config-data\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0"
Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.989729 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-265bk\" (UniqueName: \"kubernetes.io/projected/626b3115-ced1-45ea-8401-e2bd7e79a20c-kube-api-access-265bk\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.050770 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.204230 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.383834 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") "
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.383935 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") "
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.383983 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") "
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.384038 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") "
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.384095 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpz25\" (UniqueName: \"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") "
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.384144 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") "
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.384169 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") "
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.386783 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.388218 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.393654 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25" (OuterVolumeSpecName: "kube-api-access-bpz25") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "kube-api-access-bpz25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.399898 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts" (OuterVolumeSpecName: "scripts") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.416543 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.451666 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.470740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data" (OuterVolumeSpecName: "config-data") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494605 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494646 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494659 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494670 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494678 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494687 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494696 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpz25\" (UniqueName: \"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.500994 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:25:03 crc kubenswrapper[4883]: W0310 09:25:03.501511 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod626b3115_ced1_45ea_8401_e2bd7e79a20c.slice/crio-6903f2284adfe3d3184cd96a9760058fc2a0a242254e6d1376e5c90116cd44ab WatchSource:0}: Error finding container 6903f2284adfe3d3184cd96a9760058fc2a0a242254e6d1376e5c90116cd44ab: Status 404 returned error can't find the container with id 6903f2284adfe3d3184cd96a9760058fc2a0a242254e6d1376e5c90116cd44ab
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.666781 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"626b3115-ced1-45ea-8401-e2bd7e79a20c","Type":"ContainerStarted","Data":"98df58f7e20401aef58c2be9f2ce9527fe59a42792b1b41f8df8033469fd8ae0"}
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.667122 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"626b3115-ced1-45ea-8401-e2bd7e79a20c","Type":"ContainerStarted","Data":"6903f2284adfe3d3184cd96a9760058fc2a0a242254e6d1376e5c90116cd44ab"}
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670296 4883 generic.go:334] "Generic (PLEG): container finished" podID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerID="aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830" exitCode=0
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670367 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670423 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830"}
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8"}
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670527 4883 scope.go:117] "RemoveContainer" containerID="ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.695635 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.695569215 podStartE2EDuration="1.695569215s" podCreationTimestamp="2026-03-10 09:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:03.682002091 +0000 UTC m=+1289.936899980" watchObservedRunningTime="2026-03-10 09:25:03.695569215 +0000 UTC m=+1289.950467103"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.709503 4883 scope.go:117] "RemoveContainer" containerID="3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.719586 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.729786 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.734394 4883 scope.go:117] "RemoveContainer" containerID="aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.740402 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.744002 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-notification-agent"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744027 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-notification-agent"
Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.744043 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="proxy-httpd"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744120 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="proxy-httpd"
Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.744371 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-central-agent"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744387 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-central-agent"
Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.744403 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="sg-core"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744419 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="sg-core"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744612 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-notification-agent"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744634 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-central-agent"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744641 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="sg-core"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744654 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="proxy-httpd"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.746695 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.748675 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.751723 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.751798 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.753886 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.757933 4883 scope.go:117] "RemoveContainer" containerID="1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.782705 4883 scope.go:117] "RemoveContainer" containerID="ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb"
Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.783128 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb\": container with ID starting with ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb not found: ID does not exist" containerID="ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783160 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb"} err="failed to get container status \"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb\": rpc error: code = NotFound desc = could not find container \"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb\": container with ID starting with ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb not found: ID does not exist"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783184 4883 scope.go:117] "RemoveContainer" containerID="3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96"
Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.783580 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96\": container with ID starting with 3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96 not found: ID does not exist" containerID="3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783619 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96"} err="failed to get container status \"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96\": rpc error: code = NotFound desc = could not find container \"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96\": container with ID starting with 3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96 not found: ID does not exist"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783640 4883 scope.go:117] "RemoveContainer" containerID="aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830"
Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.783909 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830\": container with ID starting with aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830 not found: ID does not exist" containerID="aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783940 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830"} err="failed to get container status \"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830\": rpc error: code = NotFound desc = could not find container \"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830\": container with ID starting with aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830 not found: ID does not exist"
Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783954 4883 scope.go:117] "RemoveContainer" containerID="1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e"
Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.784404 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e\": container with ID starting with 1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e not found: ID does not exist"
containerID="1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.784447 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e"} err="failed to get container status \"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e\": rpc error: code = NotFound desc = could not find container \"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e\": container with ID starting with 1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e not found: ID does not exist" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908011 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908071 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908120 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908147 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908280 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908530 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908597 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908688 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.010566 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " 
pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.010609 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.010666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.011187 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.011849 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012003 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012348 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012547 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012726 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012797 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.016336 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.016344 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc 
kubenswrapper[4883]: I0310 09:25:04.017145 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.017320 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.017616 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.031032 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.061838 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.095566 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" path="/var/lib/kubelet/pods/13fc6b71-b633-4726-ad0d-91a04b592d3b/volumes" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.096339 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" path="/var/lib/kubelet/pods/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b/volumes" Mar 10 09:25:04 crc kubenswrapper[4883]: W0310 09:25:04.475272 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae500ef3_e9e8_490e_863f_7768270829a6.slice/crio-479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3 WatchSource:0}: Error finding container 479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3: Status 404 returned error can't find the container with id 479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3 Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.479546 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.685237 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3"} Mar 10 09:25:05 crc kubenswrapper[4883]: I0310 09:25:05.699592 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} Mar 10 09:25:06 crc kubenswrapper[4883]: I0310 09:25:06.033408 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 10 09:25:06 crc kubenswrapper[4883]: I0310 09:25:06.033468 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 09:25:06 crc kubenswrapper[4883]: I0310 09:25:06.710585 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} Mar 10 09:25:06 crc kubenswrapper[4883]: I0310 09:25:06.710857 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} Mar 10 09:25:08 crc kubenswrapper[4883]: I0310 09:25:08.052681 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 09:25:08 crc kubenswrapper[4883]: I0310 09:25:08.731186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} Mar 10 09:25:08 crc kubenswrapper[4883]: I0310 09:25:08.731702 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:25:08 crc kubenswrapper[4883]: I0310 09:25:08.755914 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.912326925 podStartE2EDuration="5.755892238s" podCreationTimestamp="2026-03-10 09:25:03 +0000 UTC" firstStartedPulling="2026-03-10 09:25:04.478511096 +0000 UTC m=+1290.733408985" lastFinishedPulling="2026-03-10 09:25:08.322076409 +0000 UTC m=+1294.576974298" observedRunningTime="2026-03-10 09:25:08.749743553 +0000 UTC m=+1295.004641443" watchObservedRunningTime="2026-03-10 09:25:08.755892238 
+0000 UTC m=+1295.010790117" Mar 10 09:25:10 crc kubenswrapper[4883]: I0310 09:25:10.003381 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 09:25:11 crc kubenswrapper[4883]: I0310 09:25:11.033689 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 09:25:11 crc kubenswrapper[4883]: I0310 09:25:11.035302 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 09:25:11 crc kubenswrapper[4883]: I0310 09:25:11.074732 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:25:11 crc kubenswrapper[4883]: I0310 09:25:11.074789 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:25:12 crc kubenswrapper[4883]: I0310 09:25:12.047603 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0743bd84-b1d5-4634-9a7f-2c9daf2a5994" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:25:12 crc kubenswrapper[4883]: I0310 09:25:12.047623 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0743bd84-b1d5-4634-9a7f-2c9daf2a5994" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:25:12 crc kubenswrapper[4883]: I0310 09:25:12.157648 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 
09:25:12 crc kubenswrapper[4883]: I0310 09:25:12.157688 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 09:25:13 crc kubenswrapper[4883]: I0310 09:25:13.050853 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 09:25:13 crc kubenswrapper[4883]: I0310 09:25:13.076512 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 09:25:13 crc kubenswrapper[4883]: I0310 09:25:13.802448 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.037644 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.038769 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.043748 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.078628 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.079209 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.081164 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.083794 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.863836 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.867876 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.868551 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.020078 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.021454 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.059094 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061231 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061300 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsjg\" (UniqueName: \"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061356 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061405 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061547 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061622 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.163776 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.163983 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164017 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsjg\" (UniqueName: \"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164052 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164079 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164140 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164726 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164885 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.165214 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.165454 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.169497 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.202141 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfsjg\" (UniqueName: 
\"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.345551 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.761764 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.873867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerStarted","Data":"bc8f46ec7a59322161bc14068b60976298d60f1cfa73a1b7887e26a2e987b797"} Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.800201 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.801454 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-central-agent" containerID="cri-o://05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" gracePeriod=30 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.801578 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-notification-agent" containerID="cri-o://7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" gracePeriod=30 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.801511 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" 
containerID="cri-o://da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" gracePeriod=30 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.801533 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="sg-core" containerID="cri-o://a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" gracePeriod=30 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.814911 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.211:3000/\": EOF" Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.884093 4883 generic.go:334] "Generic (PLEG): container finished" podID="3612d60a-476b-48fa-9163-03c2886a64b2" containerID="a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae" exitCode=0 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.884200 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerDied","Data":"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.096295 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.632386 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.718847 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.718918 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.718951 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719001 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719140 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719292 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719321 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719366 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.720278 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.720552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.725549 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts" (OuterVolumeSpecName: "scripts") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.725691 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq" (OuterVolumeSpecName: "kube-api-access-mr2bq") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "kube-api-access-mr2bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.742957 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.759129 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.779804 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822616 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822657 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822667 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822680 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822691 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822700 4883 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822708 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.828577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data" (OuterVolumeSpecName: "config-data") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.896403 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerStarted","Data":"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.897045 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899588 4883 generic.go:334] "Generic (PLEG): container finished" podID="ae500ef3-e9e8-490e-863f-7768270829a6" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" exitCode=0 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899623 4883 generic.go:334] "Generic (PLEG): container finished" podID="ae500ef3-e9e8-490e-863f-7768270829a6" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" exitCode=2 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899657 4883 generic.go:334] "Generic (PLEG): container finished" podID="ae500ef3-e9e8-490e-863f-7768270829a6" 
containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" exitCode=0 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899666 4883 generic.go:334] "Generic (PLEG): container finished" podID="ae500ef3-e9e8-490e-863f-7768270829a6" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" exitCode=0 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899671 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899664 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899841 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899865 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899878 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899888 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3"} 
Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899906 4883 scope.go:117] "RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.900158 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" containerID="cri-o://8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" gracePeriod=30 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.900413 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-api" containerID="cri-o://4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" gracePeriod=30 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.924015 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.927571 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" podStartSLOduration=3.927560458 podStartE2EDuration="3.927560458s" podCreationTimestamp="2026-03-10 09:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:24.921468511 +0000 UTC m=+1311.176366410" watchObservedRunningTime="2026-03-10 09:25:24.927560458 +0000 UTC m=+1311.182458347" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.929656 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.949110 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.950371 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.961796 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.972890 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: E0310 09:25:24.973427 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="sg-core" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973450 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="sg-core" Mar 10 09:25:24 crc kubenswrapper[4883]: E0310 09:25:24.973468 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973491 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" Mar 10 09:25:24 crc kubenswrapper[4883]: E0310 09:25:24.973511 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-notification-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973518 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-notification-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: E0310 09:25:24.973528 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-central-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973534 4883 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-central-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973776 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973798 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-notification-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973807 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="sg-core" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973823 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-central-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.975644 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.976629 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.981336 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.981681 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.982006 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.991906 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.999780 4883 scope.go:117] "RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: E0310 09:25:25.000429 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.000462 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} err="failed to get container status \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": 
container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.000603 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc kubenswrapper[4883]: E0310 09:25:25.001036 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001071 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} err="failed to get container status \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001097 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: E0310 09:25:25.001373 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" 
containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001410 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} err="failed to get container status \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": rpc error: code = NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001437 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: E0310 09:25:25.001730 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001754 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} err="failed to get container status \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001767 4883 scope.go:117] 
"RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001956 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} err="failed to get container status \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001989 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.002947 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} err="failed to get container status \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.002969 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.003174 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} err="failed to get container status \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": rpc error: code = 
NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.003202 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.003407 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} err="failed to get container status \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.003425 4883 scope.go:117] "RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.004639 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} err="failed to get container status \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.004664 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc 
kubenswrapper[4883]: I0310 09:25:25.012528 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} err="failed to get container status \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.012563 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.012941 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} err="failed to get container status \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": rpc error: code = NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.012988 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017053 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} err="failed to get container status \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container 
with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017098 4883 scope.go:117] "RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017551 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} err="failed to get container status \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017581 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017863 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} err="failed to get container status \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017886 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.018287 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} err="failed to get container status \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": rpc error: code = NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.018335 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.018737 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} err="failed to get container status \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128654 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwpj6\" (UniqueName: \"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128806 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " 
pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128847 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128933 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128968 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128999 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.129033 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.129216 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.231598 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232378 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232465 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwpj6\" (UniqueName: \"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232562 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232594 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") pod 
\"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232668 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232693 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232719 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.233295 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.234723 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.239179 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.239369 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.240073 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.240210 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.244797 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.249141 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwpj6\" (UniqueName: \"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.305670 4883 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.542078 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.712077 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:25 crc kubenswrapper[4883]: W0310 09:25:25.713687 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod718b65b8_f5c7_4933_945a_8e5e5dea72a4.slice/crio-348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed WatchSource:0}: Error finding container 348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed: Status 404 returned error can't find the container with id 348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.915331 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerDied","Data":"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6"} Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.915228 4883 generic.go:334] "Generic (PLEG): container finished" podID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerID="8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" exitCode=143 Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.917625 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed"} Mar 10 09:25:26 crc kubenswrapper[4883]: I0310 09:25:26.104908 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" 
path="/var/lib/kubelet/pods/ae500ef3-e9e8-490e-863f-7768270829a6/volumes" Mar 10 09:25:26 crc kubenswrapper[4883]: I0310 09:25:26.927505 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} Mar 10 09:25:27 crc kubenswrapper[4883]: I0310 09:25:27.943584 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} Mar 10 09:25:27 crc kubenswrapper[4883]: I0310 09:25:27.943860 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.443573 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.608869 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") pod \"6f69269f-4be5-4302-b2ad-8f38012ef305\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.609019 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") pod \"6f69269f-4be5-4302-b2ad-8f38012ef305\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.609071 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") pod \"6f69269f-4be5-4302-b2ad-8f38012ef305\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.609236 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") pod \"6f69269f-4be5-4302-b2ad-8f38012ef305\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.611167 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs" (OuterVolumeSpecName: "logs") pod "6f69269f-4be5-4302-b2ad-8f38012ef305" (UID: "6f69269f-4be5-4302-b2ad-8f38012ef305"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.614133 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl" (OuterVolumeSpecName: "kube-api-access-cb2dl") pod "6f69269f-4be5-4302-b2ad-8f38012ef305" (UID: "6f69269f-4be5-4302-b2ad-8f38012ef305"). InnerVolumeSpecName "kube-api-access-cb2dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.637824 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data" (OuterVolumeSpecName: "config-data") pod "6f69269f-4be5-4302-b2ad-8f38012ef305" (UID: "6f69269f-4be5-4302-b2ad-8f38012ef305"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.649954 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f69269f-4be5-4302-b2ad-8f38012ef305" (UID: "6f69269f-4be5-4302-b2ad-8f38012ef305"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.711546 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.711826 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.711836 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.711848 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.953973 4883 generic.go:334] "Generic (PLEG): container finished" podID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerID="4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" exitCode=0 Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.954025 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerDied","Data":"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367"} Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.954031 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.954056 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerDied","Data":"f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0"} Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.954074 4883 scope.go:117] "RemoveContainer" containerID="4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.978267 4883 scope.go:117] "RemoveContainer" containerID="8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.992625 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.997240 4883 scope.go:117] "RemoveContainer" containerID="4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" Mar 10 09:25:28 crc kubenswrapper[4883]: E0310 09:25:28.997871 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367\": container with ID starting with 4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367 not found: ID does not exist" containerID="4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.997957 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367"} err="failed to get container status \"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367\": rpc error: code = NotFound desc = could not find container \"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367\": container with ID starting with 
4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367 not found: ID does not exist" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.998008 4883 scope.go:117] "RemoveContainer" containerID="8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" Mar 10 09:25:28 crc kubenswrapper[4883]: E0310 09:25:28.998527 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6\": container with ID starting with 8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6 not found: ID does not exist" containerID="8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.998572 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6"} err="failed to get container status \"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6\": rpc error: code = NotFound desc = could not find container \"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6\": container with ID starting with 8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6 not found: ID does not exist" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.000928 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.017492 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:29 crc kubenswrapper[4883]: E0310 09:25:29.017935 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-api" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.017958 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" 
containerName="nova-api-api" Mar 10 09:25:29 crc kubenswrapper[4883]: E0310 09:25:29.018004 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.018011 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.018221 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-api" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.018244 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.019230 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.021786 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.022001 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.022379 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.028753 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-internal-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " 
pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119714 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-config-data\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119781 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-public-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae000e-33d5-4caa-8b61-dd1ab03b9978-logs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119981 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.120050 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9l5c\" (UniqueName: \"kubernetes.io/projected/14ae000e-33d5-4caa-8b61-dd1ab03b9978-kube-api-access-s9l5c\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221583 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-config-data\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221645 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-public-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221690 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae000e-33d5-4caa-8b61-dd1ab03b9978-logs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221734 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221767 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9l5c\" (UniqueName: \"kubernetes.io/projected/14ae000e-33d5-4caa-8b61-dd1ab03b9978-kube-api-access-s9l5c\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221824 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-internal-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " 
pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.222850 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae000e-33d5-4caa-8b61-dd1ab03b9978-logs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.227944 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-internal-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.228020 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.228426 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-config-data\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.228852 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-public-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.236768 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9l5c\" (UniqueName: \"kubernetes.io/projected/14ae000e-33d5-4caa-8b61-dd1ab03b9978-kube-api-access-s9l5c\") 
pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.340105 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.751552 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:29 crc kubenswrapper[4883]: W0310 09:25:29.757876 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ae000e_33d5_4caa_8b61_dd1ab03b9978.slice/crio-fb95f7414f2896871a87b668df7d8fd697323d11035390d802f4e5f55c74059c WatchSource:0}: Error finding container fb95f7414f2896871a87b668df7d8fd697323d11035390d802f4e5f55c74059c: Status 404 returned error can't find the container with id fb95f7414f2896871a87b668df7d8fd697323d11035390d802f4e5f55c74059c Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.965547 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14ae000e-33d5-4caa-8b61-dd1ab03b9978","Type":"ContainerStarted","Data":"4c319601427e6a2871510f3669bb92ab1cb581950f5e3a928817e32ff7fa92f1"} Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.965917 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14ae000e-33d5-4caa-8b61-dd1ab03b9978","Type":"ContainerStarted","Data":"fb95f7414f2896871a87b668df7d8fd697323d11035390d802f4e5f55c74059c"} Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.095854 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" path="/var/lib/kubelet/pods/6f69269f-4be5-4302-b2ad-8f38012ef305/volumes" Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.979126 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"14ae000e-33d5-4caa-8b61-dd1ab03b9978","Type":"ContainerStarted","Data":"d42705f13688b92d2af86867c60cdd3f936e4cc75d857d04e5a1caf06f5d4373"} Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982347 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982624 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-central-agent" containerID="cri-o://54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" gracePeriod=30 Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982669 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="sg-core" containerID="cri-o://40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" gracePeriod=30 Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982723 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="proxy-httpd" containerID="cri-o://d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" gracePeriod=30 Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982705 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982725 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-notification-agent" containerID="cri-o://f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" 
gracePeriod=30 Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.998245 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.998235251 podStartE2EDuration="2.998235251s" podCreationTimestamp="2026-03-10 09:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:30.994542958 +0000 UTC m=+1317.249440847" watchObservedRunningTime="2026-03-10 09:25:30.998235251 +0000 UTC m=+1317.253133140" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.019141 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.059719354 podStartE2EDuration="7.019132866s" podCreationTimestamp="2026-03-10 09:25:24 +0000 UTC" firstStartedPulling="2026-03-10 09:25:25.716865229 +0000 UTC m=+1311.971763117" lastFinishedPulling="2026-03-10 09:25:30.676278751 +0000 UTC m=+1316.931176629" observedRunningTime="2026-03-10 09:25:31.010905423 +0000 UTC m=+1317.265803322" watchObservedRunningTime="2026-03-10 09:25:31.019132866 +0000 UTC m=+1317.274030756" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.831954 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.978751 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.978805 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.978883 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.978997 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.979082 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.979191 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwpj6\" (UniqueName: 
\"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.979322 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.979354 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.980092 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.980219 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.985157 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts" (OuterVolumeSpecName: "scripts") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.985736 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6" (OuterVolumeSpecName: "kube-api-access-vwpj6") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "kube-api-access-vwpj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999274 4883 generic.go:334] "Generic (PLEG): container finished" podID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" exitCode=0 Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999336 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999341 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999414 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999370 4883 generic.go:334] "Generic (PLEG): container finished" podID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" exitCode=2 Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999453 4883 generic.go:334] "Generic (PLEG): container finished" podID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" exitCode=0 Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999487 4883 generic.go:334] "Generic (PLEG): container finished" podID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" exitCode=0 Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999505 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999696 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999743 4883 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999763 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed"} Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.006952 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.026390 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.042297 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.055340 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data" (OuterVolumeSpecName: "config-data") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.075035 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082465 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwpj6\" (UniqueName: \"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082512 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082524 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082534 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082546 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 
09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082556 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082566 4883 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082575 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.094729 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.117665 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.143890 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.144243 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144278 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} err="failed to get container status \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144305 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.144583 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not exist" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144605 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} err="failed to get container status \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144619 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.144895 4883 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144915 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} err="failed to get container status \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144930 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.145166 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145189 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} err="failed to get container status \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": rpc error: code = NotFound desc = could not find container 
\"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145205 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145671 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} err="failed to get container status \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145694 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145948 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} err="failed to get container status \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145969 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146190 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} err="failed to get container status \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146209 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146408 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} err="failed to get container status \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": rpc error: code = NotFound desc = could not find container \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146629 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146883 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} err="failed to get container status \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with 
d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146902 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147098 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} err="failed to get container status \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147116 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147344 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} err="failed to get container status \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147362 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147702 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} err="failed to get container status \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": rpc error: code = NotFound desc = could not find container \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147750 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148022 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} err="failed to get container status \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148046 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148250 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} err="failed to get container status \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not 
exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148270 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148445 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} err="failed to get container status \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148464 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148662 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} err="failed to get container status \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": rpc error: code = NotFound desc = could not find container \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.326982 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.333697 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.347448 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.352777 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.353316 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="proxy-httpd" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353336 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="proxy-httpd" Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.353387 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-central-agent" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353394 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-central-agent" Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.353410 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-notification-agent" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353417 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-notification-agent" Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.353458 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="sg-core" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353465 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="sg-core" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353704 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" 
containerName="ceilometer-notification-agent" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353746 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="proxy-httpd" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353759 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="sg-core" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353769 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-central-agent" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.355856 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.357798 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.358117 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.358383 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.373190 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.411067 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"] Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.411327 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="dnsmasq-dns" containerID="cri-o://8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867" gracePeriod=10 Mar 
10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.492919 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.492977 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493019 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-config-data\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493646 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493811 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-scripts\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493930 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.494123 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqldd\" (UniqueName: \"kubernetes.io/projected/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-kube-api-access-kqldd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.596616 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597128 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597176 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-config-data\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597296 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-scripts\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597358 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597390 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597437 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqldd\" (UniqueName: \"kubernetes.io/projected/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-kube-api-access-kqldd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597070 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.598879 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.604256 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-scripts\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.606962 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.608764 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-config-data\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.609331 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.618728 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqldd\" (UniqueName: \"kubernetes.io/projected/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-kube-api-access-kqldd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.628823 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.670664 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.895902 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs"
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.010825 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") "
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.010961 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") "
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.011025 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") "
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.011157 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") "
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.011197 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") "
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.011739 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") "
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016686 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79" (OuterVolumeSpecName: "kube-api-access-hrr79") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "kube-api-access-hrr79". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016741 4883 generic.go:334] "Generic (PLEG): container finished" podID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerID="8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867" exitCode=0
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016802 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerDied","Data":"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867"}
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016836 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerDied","Data":"48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8"}
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016846 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs"
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016862 4883 scope.go:117] "RemoveContainer" containerID="8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867"
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.050157 4883 scope.go:117] "RemoveContainer" containerID="419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8"
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.054028 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.060406 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.060902 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config" (OuterVolumeSpecName: "config") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.061668 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.064525 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.078999 4883 scope.go:117] "RemoveContainer" containerID="8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867"
Mar 10 09:25:33 crc kubenswrapper[4883]: E0310 09:25:33.079751 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867\": container with ID starting with 8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867 not found: ID does not exist" containerID="8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867"
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.079794 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867"} err="failed to get container status \"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867\": rpc error: code = NotFound desc = could not find container \"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867\": container with ID starting with 8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867 not found: ID does not exist"
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.079821 4883 scope.go:117] "RemoveContainer" containerID="419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8"
Mar 10 09:25:33 crc kubenswrapper[4883]: E0310 09:25:33.080062 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8\": container with ID starting with 419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8 not found: ID does not exist" containerID="419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8"
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.080085 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8"} err="failed to get container status \"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8\": rpc error: code = NotFound desc = could not find container \"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8\": container with ID starting with 419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8 not found: ID does not exist"
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118855 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118892 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118906 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118917 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118931 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118941 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.141506 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:33 crc kubenswrapper[4883]: W0310 09:25:33.143117 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0819f125_35db_4a0e_8fff_c1d3d3a27ae7.slice/crio-3665a6e9a45a41d5d26fb264a72f8e2eb825426ad9a5ff0aa026b9163c9f8251 WatchSource:0}: Error finding container 3665a6e9a45a41d5d26fb264a72f8e2eb825426ad9a5ff0aa026b9163c9f8251: Status 404 returned error can't find the container with id 3665a6e9a45a41d5d26fb264a72f8e2eb825426ad9a5ff0aa026b9163c9f8251
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.353168 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"]
Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.362581 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"]
Mar 10 09:25:34 crc kubenswrapper[4883]: I0310 09:25:34.028734 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"6b425537c6e483b6f3eaf025c0fafabce4e535c71aad18b280b99b81166e3680"}
Mar 10 09:25:34 crc kubenswrapper[4883]: I0310 09:25:34.029141 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"3665a6e9a45a41d5d26fb264a72f8e2eb825426ad9a5ff0aa026b9163c9f8251"}
Mar 10 09:25:34 crc kubenswrapper[4883]: I0310 09:25:34.094929 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" path="/var/lib/kubelet/pods/718b65b8-f5c7-4933-945a-8e5e5dea72a4/volumes"
Mar 10 09:25:34 crc kubenswrapper[4883]: I0310 09:25:34.095735 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" path="/var/lib/kubelet/pods/abc326c8-0db0-4645-b1dc-3871b1b4202c/volumes"
Mar 10 09:25:35 crc kubenswrapper[4883]: I0310 09:25:35.045763 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"8722d4d9e78b4e3fdd7d4d7e1afff9fe658e01839db036dd1e53cf059aff5e0c"}
Mar 10 09:25:36 crc kubenswrapper[4883]: I0310 09:25:36.055863 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"1a0d31cb723db77d80365324a3a9f5216de56fece3784158f2f794688c8a4f83"}
Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.107929 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"3eb92d4db72f0dfd8ed05d7c20385851473d8ece9cf57b03c7a2ca0ddd17c0c7"}
Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.108458 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.133884 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.023845289 podStartE2EDuration="7.133859322s" podCreationTimestamp="2026-03-10 09:25:32 +0000 UTC" firstStartedPulling="2026-03-10 09:25:33.145955698 +0000 UTC m=+1319.400853588" lastFinishedPulling="2026-03-10 09:25:38.255969732 +0000 UTC m=+1324.510867621" observedRunningTime="2026-03-10 09:25:39.123833577 +0000 UTC m=+1325.378731466" watchObservedRunningTime="2026-03-10 09:25:39.133859322 +0000 UTC m=+1325.388757211"
Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.341164 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.341445 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 09:25:40 crc kubenswrapper[4883]: I0310 09:25:40.356622 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14ae000e-33d5-4caa-8b61-dd1ab03b9978" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 10 09:25:40 crc kubenswrapper[4883]: I0310 09:25:40.356666 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14ae000e-33d5-4caa-8b61-dd1ab03b9978" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.561536 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"]
Mar 10 09:25:45 crc kubenswrapper[4883]: E0310 09:25:45.562416 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="dnsmasq-dns"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.562440 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="dnsmasq-dns"
Mar 10 09:25:45 crc kubenswrapper[4883]: E0310 09:25:45.562451 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="init"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.562456 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="init"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.562665 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="dnsmasq-dns"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.563815 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.574562 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"]
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.581770 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.581890 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.581963 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.683517 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.683668 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.683748 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.683934 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.684140 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.705219 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.881659 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:46 crc kubenswrapper[4883]: I0310 09:25:46.288077 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"]
Mar 10 09:25:46 crc kubenswrapper[4883]: W0310 09:25:46.294807 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a7004ef_7b97_475f_8801_f2097208978d.slice/crio-03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc WatchSource:0}: Error finding container 03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc: Status 404 returned error can't find the container with id 03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc
Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.206145 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a7004ef-7b97-475f-8801-f2097208978d" containerID="83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b" exitCode=0
Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.206599 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerDied","Data":"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b"}
Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.206654 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerStarted","Data":"03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc"}
Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.448690 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.448767 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.230192 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a7004ef-7b97-475f-8801-f2097208978d" containerID="d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255" exitCode=0
Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.230319 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerDied","Data":"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255"}
Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.347954 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.348656 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.354039 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.355245 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 10 09:25:50 crc kubenswrapper[4883]: I0310 09:25:50.245014 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerStarted","Data":"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f"}
Mar 10 09:25:50 crc kubenswrapper[4883]: I0310 09:25:50.245391 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 10 09:25:50 crc kubenswrapper[4883]: I0310 09:25:50.251631 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 10 09:25:50 crc kubenswrapper[4883]: I0310 09:25:50.270838 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bcdxb" podStartSLOduration=2.78584722 podStartE2EDuration="5.270812196s" podCreationTimestamp="2026-03-10 09:25:45 +0000 UTC" firstStartedPulling="2026-03-10 09:25:47.20958738 +0000 UTC m=+1333.464485270" lastFinishedPulling="2026-03-10 09:25:49.694552356 +0000 UTC m=+1335.949450246" observedRunningTime="2026-03-10 09:25:50.262707364 +0000 UTC m=+1336.517605253" watchObservedRunningTime="2026-03-10 09:25:50.270812196 +0000 UTC m=+1336.525710085"
Mar 10 09:25:55 crc kubenswrapper[4883]: I0310 09:25:55.882121 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:55 crc kubenswrapper[4883]: I0310 09:25:55.882701 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:55 crc kubenswrapper[4883]: I0310 09:25:55.926034 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:56 crc kubenswrapper[4883]: I0310 09:25:56.343817 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:56 crc kubenswrapper[4883]: I0310 09:25:56.385671 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"]
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.317255 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bcdxb" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="registry-server" containerID="cri-o://f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" gracePeriod=2
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.720282 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcdxb"
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.841331 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") pod \"4a7004ef-7b97-475f-8801-f2097208978d\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") "
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.841731 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") pod \"4a7004ef-7b97-475f-8801-f2097208978d\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") "
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.842024 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") pod \"4a7004ef-7b97-475f-8801-f2097208978d\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") "
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.842840 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities" (OuterVolumeSpecName: "utilities") pod "4a7004ef-7b97-475f-8801-f2097208978d" (UID: "4a7004ef-7b97-475f-8801-f2097208978d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.859976 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f" (OuterVolumeSpecName: "kube-api-access-pjb5f") pod "4a7004ef-7b97-475f-8801-f2097208978d" (UID: "4a7004ef-7b97-475f-8801-f2097208978d"). InnerVolumeSpecName "kube-api-access-pjb5f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.864821 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a7004ef-7b97-475f-8801-f2097208978d" (UID: "4a7004ef-7b97-475f-8801-f2097208978d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.943964 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.944089 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.944167 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.327915 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a7004ef-7b97-475f-8801-f2097208978d" containerID="f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" exitCode=0
Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.327986 4883 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.327983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerDied","Data":"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f"} Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.328063 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerDied","Data":"03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc"} Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.328097 4883 scope.go:117] "RemoveContainer" containerID="f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.353756 4883 scope.go:117] "RemoveContainer" containerID="d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.365773 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"] Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.372832 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"] Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.375383 4883 scope.go:117] "RemoveContainer" containerID="83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.408534 4883 scope.go:117] "RemoveContainer" containerID="f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" Mar 10 09:25:59 crc kubenswrapper[4883]: E0310 09:25:59.408915 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f\": container with ID starting with f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f not found: ID does not exist" containerID="f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.408947 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f"} err="failed to get container status \"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f\": rpc error: code = NotFound desc = could not find container \"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f\": container with ID starting with f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f not found: ID does not exist" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.408971 4883 scope.go:117] "RemoveContainer" containerID="d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255" Mar 10 09:25:59 crc kubenswrapper[4883]: E0310 09:25:59.409213 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255\": container with ID starting with d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255 not found: ID does not exist" containerID="d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.409237 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255"} err="failed to get container status \"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255\": rpc error: code = NotFound desc = could not find container \"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255\": container with ID 
starting with d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255 not found: ID does not exist" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.409251 4883 scope.go:117] "RemoveContainer" containerID="83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b" Mar 10 09:25:59 crc kubenswrapper[4883]: E0310 09:25:59.410221 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b\": container with ID starting with 83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b not found: ID does not exist" containerID="83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.410277 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b"} err="failed to get container status \"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b\": rpc error: code = NotFound desc = could not find container \"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b\": container with ID starting with 83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b not found: ID does not exist" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.090579 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7004ef-7b97-475f-8801-f2097208978d" path="/var/lib/kubelet/pods/4a7004ef-7b97-475f-8801-f2097208978d/volumes" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.138126 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:26:00 crc kubenswrapper[4883]: E0310 09:26:00.138560 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="extract-content" Mar 10 09:26:00 crc 
kubenswrapper[4883]: I0310 09:26:00.138576 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="extract-content" Mar 10 09:26:00 crc kubenswrapper[4883]: E0310 09:26:00.138597 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="registry-server" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.138603 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="registry-server" Mar 10 09:26:00 crc kubenswrapper[4883]: E0310 09:26:00.138613 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="extract-utilities" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.138619 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="extract-utilities" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.138779 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="registry-server" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.139369 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.141041 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.141804 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.142287 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.143351 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.271903 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") pod \"auto-csr-approver-29552246-r4bxm\" (UID: \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\") " pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.374370 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") pod \"auto-csr-approver-29552246-r4bxm\" (UID: \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\") " pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.393773 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") pod \"auto-csr-approver-29552246-r4bxm\" (UID: \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\") " 
pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.453228 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.846232 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:26:01 crc kubenswrapper[4883]: I0310 09:26:01.352330 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" event={"ID":"bcfbbeba-ae1f-4e53-ba68-3cc981395803","Type":"ContainerStarted","Data":"25dd0b5e43e6c881252f792f5c474af9b797127e8b5b57afb6f82eacfccdfa31"} Mar 10 09:26:02 crc kubenswrapper[4883]: I0310 09:26:02.363718 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" event={"ID":"bcfbbeba-ae1f-4e53-ba68-3cc981395803","Type":"ContainerStarted","Data":"f847a3856c40dae8ee7ad4ac93ecaf18ffea7cc99907753e085972a5a353218f"} Mar 10 09:26:02 crc kubenswrapper[4883]: I0310 09:26:02.384449 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" podStartSLOduration=1.308223498 podStartE2EDuration="2.384421491s" podCreationTimestamp="2026-03-10 09:26:00 +0000 UTC" firstStartedPulling="2026-03-10 09:26:00.850493987 +0000 UTC m=+1347.105391876" lastFinishedPulling="2026-03-10 09:26:01.92669198 +0000 UTC m=+1348.181589869" observedRunningTime="2026-03-10 09:26:02.376770233 +0000 UTC m=+1348.631668123" watchObservedRunningTime="2026-03-10 09:26:02.384421491 +0000 UTC m=+1348.639319380" Mar 10 09:26:02 crc kubenswrapper[4883]: I0310 09:26:02.683695 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 09:26:03 crc kubenswrapper[4883]: I0310 09:26:03.377036 4883 generic.go:334] "Generic (PLEG): container finished" 
podID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" containerID="f847a3856c40dae8ee7ad4ac93ecaf18ffea7cc99907753e085972a5a353218f" exitCode=0 Mar 10 09:26:03 crc kubenswrapper[4883]: I0310 09:26:03.377093 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" event={"ID":"bcfbbeba-ae1f-4e53-ba68-3cc981395803","Type":"ContainerDied","Data":"f847a3856c40dae8ee7ad4ac93ecaf18ffea7cc99907753e085972a5a353218f"} Mar 10 09:26:04 crc kubenswrapper[4883]: I0310 09:26:04.698800 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:04 crc kubenswrapper[4883]: I0310 09:26:04.867827 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") pod \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\" (UID: \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\") " Mar 10 09:26:04 crc kubenswrapper[4883]: I0310 09:26:04.878577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6" (OuterVolumeSpecName: "kube-api-access-hzbd6") pod "bcfbbeba-ae1f-4e53-ba68-3cc981395803" (UID: "bcfbbeba-ae1f-4e53-ba68-3cc981395803"). InnerVolumeSpecName "kube-api-access-hzbd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:04 crc kubenswrapper[4883]: I0310 09:26:04.971564 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.396041 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" event={"ID":"bcfbbeba-ae1f-4e53-ba68-3cc981395803","Type":"ContainerDied","Data":"25dd0b5e43e6c881252f792f5c474af9b797127e8b5b57afb6f82eacfccdfa31"} Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.396523 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25dd0b5e43e6c881252f792f5c474af9b797127e8b5b57afb6f82eacfccdfa31" Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.396162 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.436617 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"] Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.442207 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"] Mar 10 09:26:06 crc kubenswrapper[4883]: I0310 09:26:06.090859 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed80b911-07e4-45b8-9324-dfdf65e5a508" path="/var/lib/kubelet/pods/ed80b911-07e4-45b8-9324-dfdf65e5a508/volumes" Mar 10 09:26:10 crc kubenswrapper[4883]: I0310 09:26:10.697268 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:26:12 crc kubenswrapper[4883]: I0310 09:26:12.240289 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 
09:26:15 crc kubenswrapper[4883]: I0310 09:26:15.227338 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq" containerID="cri-o://6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e" gracePeriod=604796 Mar 10 09:26:16 crc kubenswrapper[4883]: I0310 09:26:16.087631 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq" containerID="cri-o://554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5" gracePeriod=604797 Mar 10 09:26:17 crc kubenswrapper[4883]: I0310 09:26:17.449629 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:26:17 crc kubenswrapper[4883]: I0310 09:26:17.450023 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:26:19 crc kubenswrapper[4883]: I0310 09:26:19.697591 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Mar 10 09:26:19 crc kubenswrapper[4883]: I0310 09:26:19.973547 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.544124 4883 generic.go:334] "Generic (PLEG): container finished" podID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerID="6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e" exitCode=0 Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.544228 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerDied","Data":"6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e"} Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.803095 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817407 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817580 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817687 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817793 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817818 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817890 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817907 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817953 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817986 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: 
\"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.818004 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.818034 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.824226 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.826039 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.826424 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.834626 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm" (OuterVolumeSpecName: "kube-api-access-nh4vm") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "kube-api-access-nh4vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.843624 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info" (OuterVolumeSpecName: "pod-info") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.865672 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.865820 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.872683 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919678 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919809 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919869 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919919 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919978 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.920058 4883 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.920116 4883 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.920170 4883 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.926927 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data" (OuterVolumeSpecName: "config-data") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.960879 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.980042 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf" (OuterVolumeSpecName: "server-conf") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.012715 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.022031 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.022060 4883 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") on node \"crc\" DevicePath \"\""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.022073 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.022083 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.226386 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"]
Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.226820 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="setup-container"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.226836 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="setup-container"
Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.226856 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.226863 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq"
Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.226882 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" containerName="oc"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.226888 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" containerName="oc"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.227083 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.227102 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" containerName="oc"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.228068 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.236229 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.247297 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"]
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328688 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328720 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328809 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328828 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328901 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328933 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.430365 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.430417 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.430480 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431347 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431392 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431395 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431556 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431574 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431629 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431688 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.432016 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.432338 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.432413 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.455066 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.545823 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.553080 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerDied","Data":"531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848"}
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.553134 4883 scope.go:117] "RemoveContainer" containerID="6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.553238 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.555396 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.556910 4883 generic.go:334] "Generic (PLEG): container finished" podID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerID="554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5" exitCode=0
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.556944 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.556958 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerDied","Data":"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5"}
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.557017 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerDied","Data":"010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6"}
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.612319 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.619520 4883 scope.go:117] "RemoveContainer" containerID="cd676384987aa2244a851651d9f3c8c0df5750259fc99e409d5e9d090a18495f"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.621018 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.626850 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.627285 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.627299 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq"
Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.627313 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="setup-container"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.627319 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="setup-container"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.628027 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.628963 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.635886 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.636224 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.636358 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.636411 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.636979 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.637066 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x4lhh"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.637173 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.658433 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.694182 4883 scope.go:117] "RemoveContainer" containerID="554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.714723 4883 scope.go:117] "RemoveContainer" containerID="8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738122 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738161 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738288 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738357 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738403 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738505 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738526 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738546 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738600 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738657 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738984 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739076 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739097 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739131 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ssc\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-kube-api-access-v8ssc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739151 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1be05788-71cf-486a-8142-e317e959bfe9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739197 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739230 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-config-data\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739246 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739399 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739415 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1be05788-71cf-486a-8142-e317e959bfe9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739429 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.740072 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.740678 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.741352 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.743701 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info" (OuterVolumeSpecName: "pod-info") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.747984 4883 scope.go:117] "RemoveContainer" containerID="554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.749659 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.752321 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.752416 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs" (OuterVolumeSpecName: "kube-api-access-dnprs") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "kube-api-access-dnprs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.752631 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5\": container with ID starting with 554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5 not found: ID does not exist" containerID="554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.752800 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5"} err="failed to get container status \"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5\": rpc error: code = NotFound desc = could not find container \"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5\": container with ID starting with 554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5 not found: ID does not exist"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.753110 4883 scope.go:117] "RemoveContainer" containerID="8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5"
Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.756573 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5\": container with ID starting with 8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5 not found: ID does not exist" containerID="8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.756624 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5"} err="failed to get container status \"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5\": rpc error: code = NotFound desc = could not find container \"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5\": container with ID starting with 8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5 not found: ID does not exist"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.760698 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.783976 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf" (OuterVolumeSpecName: "server-conf") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.786739 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data" (OuterVolumeSpecName: "config-data") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.826371 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841820 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841899 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841920 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841951 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ssc\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-kube-api-access-v8ssc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841968 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1be05788-71cf-486a-8142-e317e959bfe9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841988 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842011 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-config-data\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842027 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842096 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842118 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1be05788-71cf-486a-8142-e317e959bfe9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842134 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842183 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") on node \"crc\" DevicePath \"\""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842195 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842215 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842338 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842393 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") on node \"crc\" DevicePath \"\"" Mar 
10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842465 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842775 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843066 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-config-data\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843108 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843228 4883 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843249 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc 
kubenswrapper[4883]: I0310 09:26:22.843262 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843274 4883 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843287 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843296 4883 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843305 4883 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843746 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.844395 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.844835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1be05788-71cf-486a-8142-e317e959bfe9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.846439 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1be05788-71cf-486a-8142-e317e959bfe9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.846686 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.856565 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ssc\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-kube-api-access-v8ssc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.858850 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.867680 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.899459 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.906961 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.926281 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.929142 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.933376 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cjf6k" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.933677 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.933829 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.933967 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.934376 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.934428 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.934686 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 09:26:22 crc 
kubenswrapper[4883]: I0310 09:26:22.939927 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.945696 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.992334 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.045578 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047463 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047769 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047806 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047889 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047908 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/170c41ad-d10f-4567-97ec-2b90d149951b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048038 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048080 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048130 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/170c41ad-d10f-4567-97ec-2b90d149951b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048326 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048377 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048400 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkflt\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-kube-api-access-dkflt\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150289 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150546 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150574 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150618 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150635 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/170c41ad-d10f-4567-97ec-2b90d149951b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150680 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150702 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150731 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/170c41ad-d10f-4567-97ec-2b90d149951b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150779 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150799 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150816 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkflt\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-kube-api-access-dkflt\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.151261 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.151936 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"170c41ad-d10f-4567-97ec-2b90d149951b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.152671 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.154160 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.154458 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.154841 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.155031 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/170c41ad-d10f-4567-97ec-2b90d149951b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 
crc kubenswrapper[4883]: I0310 09:26:23.156753 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.159064 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/170c41ad-d10f-4567-97ec-2b90d149951b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.160259 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.164774 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkflt\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-kube-api-access-dkflt\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.187389 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.253948 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.390450 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.573841 4883 generic.go:334] "Generic (PLEG): container finished" podID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerID="1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed" exitCode=0 Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.573977 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerDied","Data":"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed"} Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.574070 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerStarted","Data":"4031676c69275321eb3f6b90e5c6639caa836d24393c80f802273273e9c5edd8"} Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.578987 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1be05788-71cf-486a-8142-e317e959bfe9","Type":"ContainerStarted","Data":"cb7c2c99cd3ab9abab6d15ca157de5d75a91b7b7d0f2bf51780a524cc1b59cca"} Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.662946 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:23 crc kubenswrapper[4883]: W0310 09:26:23.669451 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod170c41ad_d10f_4567_97ec_2b90d149951b.slice/crio-8c88563f9f486927cbf6d91d4fc5131e21074808b77904a07bd82caddf9954cc WatchSource:0}: Error finding container 8c88563f9f486927cbf6d91d4fc5131e21074808b77904a07bd82caddf9954cc: Status 404 returned 
error can't find the container with id 8c88563f9f486927cbf6d91d4fc5131e21074808b77904a07bd82caddf9954cc Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.088448 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" path="/var/lib/kubelet/pods/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07/volumes" Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.089643 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" path="/var/lib/kubelet/pods/cdb6ba72-d1c8-4022-9029-2e18784e1139/volumes" Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.590985 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"170c41ad-d10f-4567-97ec-2b90d149951b","Type":"ContainerStarted","Data":"8c88563f9f486927cbf6d91d4fc5131e21074808b77904a07bd82caddf9954cc"} Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.593769 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerStarted","Data":"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b"} Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.594028 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.613372 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" podStartSLOduration=2.613359538 podStartE2EDuration="2.613359538s" podCreationTimestamp="2026-03-10 09:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:26:24.61029995 +0000 UTC m=+1370.865197839" watchObservedRunningTime="2026-03-10 09:26:24.613359538 +0000 UTC m=+1370.868257428" Mar 10 09:26:25 crc 
kubenswrapper[4883]: I0310 09:26:25.602628 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1be05788-71cf-486a-8142-e317e959bfe9","Type":"ContainerStarted","Data":"e068ac8b65990b981161335beb6d85601d5b7c6701979113d5246e6885684704"} Mar 10 09:26:25 crc kubenswrapper[4883]: I0310 09:26:25.604631 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"170c41ad-d10f-4567-97ec-2b90d149951b","Type":"ContainerStarted","Data":"40718267ba86811b3e6ea0c37d5a13d023e91c52a4ea68253cd8019fec0ff03c"} Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.556732 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.620510 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.620962 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="dnsmasq-dns" containerID="cri-o://b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" gracePeriod=10 Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.709269 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-rx8gc"] Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.711642 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.724249 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-rx8gc"] Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856321 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856370 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856396 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856959 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-config\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.857279 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.857337 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgldt\" (UniqueName: \"kubernetes.io/projected/da34e0af-a084-40fb-93ea-471923c49051-kube-api-access-pgldt\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.959945 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960013 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960040 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960069 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960138 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-config\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960189 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960223 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgldt\" (UniqueName: \"kubernetes.io/projected/da34e0af-a084-40fb-93ea-471923c49051-kube-api-access-pgldt\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960862 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960974 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.961369 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.961537 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-config\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.961822 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.962083 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.981307 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgldt\" (UniqueName: \"kubernetes.io/projected/da34e0af-a084-40fb-93ea-471923c49051-kube-api-access-pgldt\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.060591 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.061265 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165245 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165291 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165467 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsjg\" (UniqueName: \"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") pod 
\"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165519 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165617 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165707 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.173857 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg" (OuterVolumeSpecName: "kube-api-access-xfsjg") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "kube-api-access-xfsjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.204356 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.216016 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config" (OuterVolumeSpecName: "config") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.216126 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.216804 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.218194 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271665 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271702 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271716 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271732 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271744 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271756 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsjg\" (UniqueName: \"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.481037 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-rx8gc"] Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.676872 4883 generic.go:334] "Generic (PLEG): container finished" podID="da34e0af-a084-40fb-93ea-471923c49051" 
containerID="febc316b4590ba1f2958d9ee63f57bb0d076198c0d409531bb6e58901795ea3f" exitCode=0 Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.676990 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" event={"ID":"da34e0af-a084-40fb-93ea-471923c49051","Type":"ContainerDied","Data":"febc316b4590ba1f2958d9ee63f57bb0d076198c0d409531bb6e58901795ea3f"} Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.677090 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" event={"ID":"da34e0af-a084-40fb-93ea-471923c49051","Type":"ContainerStarted","Data":"bf6932deb94311a9f7a44cfb9b41801e5ea7f1f3e46d931fa9a611d58802bf48"} Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.679951 4883 generic.go:334] "Generic (PLEG): container finished" podID="3612d60a-476b-48fa-9163-03c2886a64b2" containerID="b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" exitCode=0 Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.679994 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerDied","Data":"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31"} Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.680029 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerDied","Data":"bc8f46ec7a59322161bc14068b60976298d60f1cfa73a1b7887e26a2e987b797"} Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.680041 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.680076 4883 scope.go:117] "RemoveContainer" containerID="b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.735179 4883 scope.go:117] "RemoveContainer" containerID="a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.740395 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.747686 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.838521 4883 scope.go:117] "RemoveContainer" containerID="b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" Mar 10 09:26:33 crc kubenswrapper[4883]: E0310 09:26:33.839257 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31\": container with ID starting with b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31 not found: ID does not exist" containerID="b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.839304 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31"} err="failed to get container status \"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31\": rpc error: code = NotFound desc = could not find container \"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31\": container with ID starting with b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31 not found: ID does not exist" Mar 10 
09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.839334 4883 scope.go:117] "RemoveContainer" containerID="a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae" Mar 10 09:26:33 crc kubenswrapper[4883]: E0310 09:26:33.840588 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae\": container with ID starting with a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae not found: ID does not exist" containerID="a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.840625 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae"} err="failed to get container status \"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae\": rpc error: code = NotFound desc = could not find container \"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae\": container with ID starting with a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae not found: ID does not exist" Mar 10 09:26:34 crc kubenswrapper[4883]: I0310 09:26:34.098898 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" path="/var/lib/kubelet/pods/3612d60a-476b-48fa-9163-03c2886a64b2/volumes" Mar 10 09:26:34 crc kubenswrapper[4883]: I0310 09:26:34.688785 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" event={"ID":"da34e0af-a084-40fb-93ea-471923c49051","Type":"ContainerStarted","Data":"89f8f38f554753f54883adeb6ee9e04e6496380ec4f61aab451d400118355efd"} Mar 10 09:26:34 crc kubenswrapper[4883]: I0310 09:26:34.688931 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:34 crc 
kubenswrapper[4883]: I0310 09:26:34.710583 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" podStartSLOduration=2.710566527 podStartE2EDuration="2.710566527s" podCreationTimestamp="2026-03-10 09:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:26:34.703419851 +0000 UTC m=+1380.958317741" watchObservedRunningTime="2026-03-10 09:26:34.710566527 +0000 UTC m=+1380.965464416" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.062731 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.124071 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.124529 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="dnsmasq-dns" containerID="cri-o://ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" gracePeriod=10 Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.532570 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.686571 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687100 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687299 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687772 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687908 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687991 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.688107 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.694672 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq" (OuterVolumeSpecName: "kube-api-access-dvtdq") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "kube-api-access-dvtdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.730349 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.731735 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.732663 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735574 4883 generic.go:334] "Generic (PLEG): container finished" podID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerID="ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" exitCode=0 Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735628 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerDied","Data":"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b"} Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735684 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerDied","Data":"4031676c69275321eb3f6b90e5c6639caa836d24393c80f802273273e9c5edd8"} Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735704 4883 scope.go:117] "RemoveContainer" containerID="ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735716 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.736646 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config" (OuterVolumeSpecName: "config") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.737695 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.738382 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792017 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792049 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792061 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792075 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792085 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792094 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792103 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.795811 
4883 scope.go:117] "RemoveContainer" containerID="1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.820355 4883 scope.go:117] "RemoveContainer" containerID="ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" Mar 10 09:26:38 crc kubenswrapper[4883]: E0310 09:26:38.820739 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b\": container with ID starting with ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b not found: ID does not exist" containerID="ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.820771 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b"} err="failed to get container status \"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b\": rpc error: code = NotFound desc = could not find container \"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b\": container with ID starting with ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b not found: ID does not exist" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.820796 4883 scope.go:117] "RemoveContainer" containerID="1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed" Mar 10 09:26:38 crc kubenswrapper[4883]: E0310 09:26:38.821150 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed\": container with ID starting with 1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed not found: ID does not exist" containerID="1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed" 
Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.821189 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed"} err="failed to get container status \"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed\": rpc error: code = NotFound desc = could not find container \"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed\": container with ID starting with 1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed not found: ID does not exist" Mar 10 09:26:39 crc kubenswrapper[4883]: I0310 09:26:39.067827 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:39 crc kubenswrapper[4883]: I0310 09:26:39.077216 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:40 crc kubenswrapper[4883]: I0310 09:26:40.092165 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" path="/var/lib/kubelet/pods/c97c631a-70a2-4d82-87de-84f1d8eecc19/volumes" Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.449395 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.450049 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.450106 4883 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.450715 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.450773 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225" gracePeriod=600 Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.823209 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225" exitCode=0 Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.823282 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225"} Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.823518 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"} Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.823542 4883 scope.go:117] "RemoveContainer" 
containerID="baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.748087 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7"] Mar 10 09:26:51 crc kubenswrapper[4883]: E0310 09:26:51.755879 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.755895 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: E0310 09:26:51.755911 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="init" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.755916 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="init" Mar 10 09:26:51 crc kubenswrapper[4883]: E0310 09:26:51.755938 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.755944 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: E0310 09:26:51.755959 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="init" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.755965 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="init" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.756111 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="dnsmasq-dns" Mar 10 
09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.756128 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.756616 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7"] Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.756707 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.758843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.759063 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.760813 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.760905 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.864156 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.864376 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.864639 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.864691 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.966547 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.966877 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: 
\"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.967046 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.967267 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.973460 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.973964 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.974260 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.981098 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:52 crc kubenswrapper[4883]: I0310 09:26:52.070830 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:52 crc kubenswrapper[4883]: I0310 09:26:52.553779 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7"] Mar 10 09:26:52 crc kubenswrapper[4883]: I0310 09:26:52.912760 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" event={"ID":"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9","Type":"ContainerStarted","Data":"fca1a18a7932a35b0c2408ef7f3f6f3577a7605fbf02ca9e7d5424a6dc0043be"} Mar 10 09:26:56 crc kubenswrapper[4883]: I0310 09:26:56.954067 4883 generic.go:334] "Generic (PLEG): container finished" podID="1be05788-71cf-486a-8142-e317e959bfe9" containerID="e068ac8b65990b981161335beb6d85601d5b7c6701979113d5246e6885684704" exitCode=0 Mar 10 09:26:56 crc kubenswrapper[4883]: I0310 09:26:56.954170 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"1be05788-71cf-486a-8142-e317e959bfe9","Type":"ContainerDied","Data":"e068ac8b65990b981161335beb6d85601d5b7c6701979113d5246e6885684704"} Mar 10 09:26:56 crc kubenswrapper[4883]: I0310 09:26:56.960318 4883 generic.go:334] "Generic (PLEG): container finished" podID="170c41ad-d10f-4567-97ec-2b90d149951b" containerID="40718267ba86811b3e6ea0c37d5a13d023e91c52a4ea68253cd8019fec0ff03c" exitCode=0 Mar 10 09:26:56 crc kubenswrapper[4883]: I0310 09:26:56.960405 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"170c41ad-d10f-4567-97ec-2b90d149951b","Type":"ContainerDied","Data":"40718267ba86811b3e6ea0c37d5a13d023e91c52a4ea68253cd8019fec0ff03c"} Mar 10 09:26:57 crc kubenswrapper[4883]: I0310 09:26:57.975355 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"170c41ad-d10f-4567-97ec-2b90d149951b","Type":"ContainerStarted","Data":"30c55dba4871c6db10a361237096387128c315da62812b4241a38b5f16d1fdab"} Mar 10 09:26:57 crc kubenswrapper[4883]: I0310 09:26:57.976774 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:57 crc kubenswrapper[4883]: I0310 09:26:57.980626 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1be05788-71cf-486a-8142-e317e959bfe9","Type":"ContainerStarted","Data":"d20fb133c019c39c2aef7a4f1f09bf5b11372d75dcd388ad04aab69eaa5a8b62"} Mar 10 09:26:57 crc kubenswrapper[4883]: I0310 09:26:57.980863 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 09:26:58 crc kubenswrapper[4883]: I0310 09:26:58.007526 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.007508786 podStartE2EDuration="36.007508786s" podCreationTimestamp="2026-03-10 09:26:22 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:26:57.99510652 +0000 UTC m=+1404.250004408" watchObservedRunningTime="2026-03-10 09:26:58.007508786 +0000 UTC m=+1404.262406674" Mar 10 09:26:58 crc kubenswrapper[4883]: I0310 09:26:58.021093 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.02107607 podStartE2EDuration="36.02107607s" podCreationTimestamp="2026-03-10 09:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:26:58.018611844 +0000 UTC m=+1404.273509732" watchObservedRunningTime="2026-03-10 09:26:58.02107607 +0000 UTC m=+1404.275973959" Mar 10 09:27:00 crc kubenswrapper[4883]: I0310 09:27:00.559144 4883 scope.go:117] "RemoveContainer" containerID="7f58830e2474379901c1e5ffc8128426c4b2e0d855c9a6d20d780eb3f2958fa5" Mar 10 09:27:04 crc kubenswrapper[4883]: I0310 09:27:04.050551 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" event={"ID":"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9","Type":"ContainerStarted","Data":"fcf4883556625c19a1455d845635bedb77c5247e864a5aa77d1d91768b67866b"} Mar 10 09:27:04 crc kubenswrapper[4883]: I0310 09:27:04.069872 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" podStartSLOduration=1.905657462 podStartE2EDuration="13.06983589s" podCreationTimestamp="2026-03-10 09:26:51 +0000 UTC" firstStartedPulling="2026-03-10 09:26:52.560778998 +0000 UTC m=+1398.815676887" lastFinishedPulling="2026-03-10 09:27:03.724957426 +0000 UTC m=+1409.979855315" observedRunningTime="2026-03-10 09:27:04.067753535 +0000 UTC m=+1410.322651423" watchObservedRunningTime="2026-03-10 09:27:04.06983589 +0000 UTC m=+1410.324733780" Mar 10 09:27:12 crc 
kubenswrapper[4883]: I0310 09:27:12.995661 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 09:27:13 crc kubenswrapper[4883]: I0310 09:27:13.257678 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:27:15 crc kubenswrapper[4883]: I0310 09:27:15.148685 4883 generic.go:334] "Generic (PLEG): container finished" podID="4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" containerID="fcf4883556625c19a1455d845635bedb77c5247e864a5aa77d1d91768b67866b" exitCode=0 Mar 10 09:27:15 crc kubenswrapper[4883]: I0310 09:27:15.148794 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" event={"ID":"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9","Type":"ContainerDied","Data":"fcf4883556625c19a1455d845635bedb77c5247e864a5aa77d1d91768b67866b"} Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.552100 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.689030 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") pod \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.689141 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") pod \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.689541 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") pod \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.689652 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") pod \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.695629 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" (UID: "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.696288 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25" (OuterVolumeSpecName: "kube-api-access-qdf25") pod "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" (UID: "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9"). InnerVolumeSpecName "kube-api-access-qdf25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.713320 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory" (OuterVolumeSpecName: "inventory") pod "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" (UID: "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.719352 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" (UID: "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.792579 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.792615 4883 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.792630 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.792639 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.168579 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" event={"ID":"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9","Type":"ContainerDied","Data":"fca1a18a7932a35b0c2408ef7f3f6f3577a7605fbf02ca9e7d5424a6dc0043be"} Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.168959 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fca1a18a7932a35b0c2408ef7f3f6f3577a7605fbf02ca9e7d5424a6dc0043be" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.168622 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.235559 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9"] Mar 10 09:27:17 crc kubenswrapper[4883]: E0310 09:27:17.235920 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.235938 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.236116 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.236663 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.238490 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.238749 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.238946 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.239172 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.249147 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9"] Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.402826 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.402897 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.403029 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.505091 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.505175 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.505223 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.511158 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.511914 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.521050 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.550566 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9"
Mar 10 09:27:18 crc kubenswrapper[4883]: W0310 09:27:18.016748 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3461a81_abbe_4c3e_88ca_42eff1eeb14e.slice/crio-6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3 WatchSource:0}: Error finding container 6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3: Status 404 returned error can't find the container with id 6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3
Mar 10 09:27:18 crc kubenswrapper[4883]: I0310 09:27:18.018175 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9"]
Mar 10 09:27:18 crc kubenswrapper[4883]: I0310 09:27:18.187877 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" event={"ID":"d3461a81-abbe-4c3e-88ca-42eff1eeb14e","Type":"ContainerStarted","Data":"6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3"}
Mar 10 09:27:19 crc kubenswrapper[4883]: I0310 09:27:19.199408 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" event={"ID":"d3461a81-abbe-4c3e-88ca-42eff1eeb14e","Type":"ContainerStarted","Data":"eb4b3058a2017f66deb4eca98d6a66f7aa07bc8b8282766cfccdcf2f21aeb9bc"}
Mar 10 09:27:19 crc kubenswrapper[4883]: I0310 09:27:19.224760 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" podStartSLOduration=1.674391977 podStartE2EDuration="2.224745314s" podCreationTimestamp="2026-03-10 09:27:17 +0000 UTC" firstStartedPulling="2026-03-10 09:27:18.019769242 +0000 UTC m=+1424.274667130" lastFinishedPulling="2026-03-10 09:27:18.570122578 +0000 UTC m=+1424.825020467" observedRunningTime="2026-03-10 09:27:19.213387817 +0000 UTC m=+1425.468285706" watchObservedRunningTime="2026-03-10 09:27:19.224745314 +0000 UTC m=+1425.479643203"
Mar 10 09:27:21 crc kubenswrapper[4883]: I0310 09:27:21.218281 4883 generic.go:334] "Generic (PLEG): container finished" podID="d3461a81-abbe-4c3e-88ca-42eff1eeb14e" containerID="eb4b3058a2017f66deb4eca98d6a66f7aa07bc8b8282766cfccdcf2f21aeb9bc" exitCode=0
Mar 10 09:27:21 crc kubenswrapper[4883]: I0310 09:27:21.218358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" event={"ID":"d3461a81-abbe-4c3e-88ca-42eff1eeb14e","Type":"ContainerDied","Data":"eb4b3058a2017f66deb4eca98d6a66f7aa07bc8b8282766cfccdcf2f21aeb9bc"}
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.586948 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9"
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.707839 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") pod \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") "
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.707985 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") pod \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") "
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.708035 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") pod \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") "
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.714692 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c" (OuterVolumeSpecName: "kube-api-access-5c27c") pod "d3461a81-abbe-4c3e-88ca-42eff1eeb14e" (UID: "d3461a81-abbe-4c3e-88ca-42eff1eeb14e"). InnerVolumeSpecName "kube-api-access-5c27c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.734213 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory" (OuterVolumeSpecName: "inventory") pod "d3461a81-abbe-4c3e-88ca-42eff1eeb14e" (UID: "d3461a81-abbe-4c3e-88ca-42eff1eeb14e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.736373 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d3461a81-abbe-4c3e-88ca-42eff1eeb14e" (UID: "d3461a81-abbe-4c3e-88ca-42eff1eeb14e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.810759 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.810800 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.810814 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") on node \"crc\" DevicePath \"\""
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.240647 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" event={"ID":"d3461a81-abbe-4c3e-88ca-42eff1eeb14e","Type":"ContainerDied","Data":"6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3"}
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.240981 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.240740 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.390866 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"]
Mar 10 09:27:23 crc kubenswrapper[4883]: E0310 09:27:23.391295 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3461a81-abbe-4c3e-88ca-42eff1eeb14e" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.391314 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3461a81-abbe-4c3e-88ca-42eff1eeb14e" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.391502 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3461a81-abbe-4c3e-88ca-42eff1eeb14e" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.392188 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.394096 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.396719 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"]
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.398986 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.399188 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.399336 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.422412 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.422687 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.422869 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.423089 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.524013 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.524070 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.524140 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.524205 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.529499 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.530213 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.530783 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.538155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.716032 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:27:24 crc kubenswrapper[4883]: I0310 09:27:24.238707 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"]
Mar 10 09:27:24 crc kubenswrapper[4883]: I0310 09:27:24.253706 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" event={"ID":"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd","Type":"ContainerStarted","Data":"733b4fd44a5ac5d10292b081b7fba6dec6ce4014fc5f18ef316d40cce4e76602"}
Mar 10 09:27:25 crc kubenswrapper[4883]: I0310 09:27:25.267753 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" event={"ID":"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd","Type":"ContainerStarted","Data":"5043f618c5751d9b1d780a8c20af9397bbb12825aec770e998b692d0b3a30888"}
Mar 10 09:27:25 crc kubenswrapper[4883]: I0310 09:27:25.288337 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" podStartSLOduration=1.804470642 podStartE2EDuration="2.288320412s" podCreationTimestamp="2026-03-10 09:27:23 +0000 UTC" firstStartedPulling="2026-03-10 09:27:24.241953688 +0000 UTC m=+1430.496851577" lastFinishedPulling="2026-03-10 09:27:24.725803458 +0000 UTC m=+1430.980701347" observedRunningTime="2026-03-10 09:27:25.284344394 +0000 UTC m=+1431.539242283" watchObservedRunningTime="2026-03-10 09:27:25.288320412 +0000 UTC m=+1431.543218301"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.078139 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"]
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.081613 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.098295 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"]
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.261219 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.261381 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.261404 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrlbz\" (UniqueName: \"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363277 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363350 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrlbz\" (UniqueName: \"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363374 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363884 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.381739 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrlbz\" (UniqueName: \"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.404747 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.812137 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"]
Mar 10 09:27:48 crc kubenswrapper[4883]: I0310 09:27:48.468298 4883 generic.go:334] "Generic (PLEG): container finished" podID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerID="bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f" exitCode=0
Mar 10 09:27:48 crc kubenswrapper[4883]: I0310 09:27:48.468414 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerDied","Data":"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f"}
Mar 10 09:27:48 crc kubenswrapper[4883]: I0310 09:27:48.468798 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerStarted","Data":"23cbf9373eb80662f59c04d25e32dcc2645848ba76f31abb0fbd36eb95635d39"}
Mar 10 09:27:49 crc kubenswrapper[4883]: I0310 09:27:49.482394 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerStarted","Data":"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1"}
Mar 10 09:27:50 crc kubenswrapper[4883]: I0310 09:27:50.494706 4883 generic.go:334] "Generic (PLEG): container finished" podID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerID="60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1" exitCode=0
Mar 10 09:27:50 crc kubenswrapper[4883]: I0310 09:27:50.494775 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerDied","Data":"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1"}
Mar 10 09:27:51 crc kubenswrapper[4883]: I0310 09:27:51.507938 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerStarted","Data":"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c"}
Mar 10 09:27:51 crc kubenswrapper[4883]: I0310 09:27:51.526274 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvb5h" podStartSLOduration=2.057024177 podStartE2EDuration="4.526255182s" podCreationTimestamp="2026-03-10 09:27:47 +0000 UTC" firstStartedPulling="2026-03-10 09:27:48.470896501 +0000 UTC m=+1454.725794390" lastFinishedPulling="2026-03-10 09:27:50.940127506 +0000 UTC m=+1457.195025395" observedRunningTime="2026-03-10 09:27:51.524441772 +0000 UTC m=+1457.779339661" watchObservedRunningTime="2026-03-10 09:27:51.526255182 +0000 UTC m=+1457.781153071"
Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.405421 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.406044 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.449635 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.612377 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.681054 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"]
Mar 10 09:27:59 crc kubenswrapper[4883]: I0310 09:27:59.590490 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zvb5h" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="registry-server" containerID="cri-o://b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c" gracePeriod=2
Mar 10 09:27:59 crc kubenswrapper[4883]: I0310 09:27:59.961174 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.138567 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"]
Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.138960 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="extract-content"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.138977 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="extract-content"
Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.138987 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="registry-server"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.138994 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="registry-server"
Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.139015 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="extract-utilities"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.139021 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="extract-utilities"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.139184 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="registry-server"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.139781 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-gvmzc"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.141461 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.141701 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.141892 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.150739 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"]
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.152728 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") pod \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") "
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.152759 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrlbz\" (UniqueName: \"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") pod \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") "
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.152928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") pod \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") "
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.153578 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities" (OuterVolumeSpecName: "utilities") pod "bcd77dbb-628a-4eb6-9910-6d75adb8025c" (UID: "bcd77dbb-628a-4eb6-9910-6d75adb8025c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.158958 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz" (OuterVolumeSpecName: "kube-api-access-zrlbz") pod "bcd77dbb-628a-4eb6-9910-6d75adb8025c" (UID: "bcd77dbb-628a-4eb6-9910-6d75adb8025c"). InnerVolumeSpecName "kube-api-access-zrlbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.256999 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcd77dbb-628a-4eb6-9910-6d75adb8025c" (UID: "bcd77dbb-628a-4eb6-9910-6d75adb8025c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.257049 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") pod \"auto-csr-approver-29552248-gvmzc\" (UID: \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\") " pod="openshift-infra/auto-csr-approver-29552248-gvmzc"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.257354 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.257530 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrlbz\" (UniqueName: \"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") on node \"crc\" DevicePath \"\""
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.257579 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.358633 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") pod \"auto-csr-approver-29552248-gvmzc\" (UID: \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\") " pod="openshift-infra/auto-csr-approver-29552248-gvmzc"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.376321 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") pod \"auto-csr-approver-29552248-gvmzc\" (UID: \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\") " pod="openshift-infra/auto-csr-approver-29552248-gvmzc"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.454173 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-gvmzc"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.609589 4883 generic.go:334] "Generic (PLEG): container finished" podID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerID="b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c" exitCode=0
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.609935 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerDied","Data":"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c"}
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.609984 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerDied","Data":"23cbf9373eb80662f59c04d25e32dcc2645848ba76f31abb0fbd36eb95635d39"}
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.610018 4883 scope.go:117] "RemoveContainer" containerID="b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.610219 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvb5h"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.657906 4883 scope.go:117] "RemoveContainer" containerID="60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.667252 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"]
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.675015 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"]
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.694826 4883 scope.go:117] "RemoveContainer" containerID="bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.740458 4883 scope.go:117] "RemoveContainer" containerID="b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c"
Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.740900 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c\": container with ID starting with b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c not found: ID does not exist" containerID="b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.740953 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c"} err="failed to get container status \"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c\": rpc error: code = NotFound desc = could not find container \"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c\": container with ID starting with b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c not found: ID does not exist"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.740989 4883 scope.go:117] "RemoveContainer" containerID="60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1"
Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.741334 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1\": container with ID starting with 60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1 not found: ID does not exist" containerID="60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.741366 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1"} err="failed to get container status \"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1\": rpc error: code = NotFound desc = could not find container \"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1\": container with ID starting with 60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1 not found: ID does not exist"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.741389 4883 scope.go:117] "RemoveContainer" containerID="bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f"
Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.741751 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f\": container with ID starting with bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f not found: ID does not exist" containerID="bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.741777 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f"} err="failed to get container status \"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f\": rpc error: code = NotFound desc = could not find container \"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f\": container with ID starting with bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f not found: ID does not exist"
Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.857067 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"]
Mar 10 09:28:01 crc kubenswrapper[4883]: I0310 09:28:01.620754 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" event={"ID":"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce","Type":"ContainerStarted","Data":"ca84c9526f517c8e62b25d8452228a7e93d95d315eeae37348887444851278b0"}
Mar 10 09:28:02 crc kubenswrapper[4883]: I0310 09:28:02.090855 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" path="/var/lib/kubelet/pods/bcd77dbb-628a-4eb6-9910-6d75adb8025c/volumes"
Mar 10 09:28:02 crc kubenswrapper[4883]: I0310 09:28:02.633868 4883 generic.go:334] "Generic (PLEG): container finished" podID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" containerID="bec7039a9730daef00f6f750a9ad04bbfacd1a82b7aa8fa9152df9b60b9ae6d2" exitCode=0
Mar 10 09:28:02 crc kubenswrapper[4883]: I0310 09:28:02.633986 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" event={"ID":"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce","Type":"ContainerDied","Data":"bec7039a9730daef00f6f750a9ad04bbfacd1a82b7aa8fa9152df9b60b9ae6d2"}
Mar 10 09:28:03 crc kubenswrapper[4883]: I0310 09:28:03.762779 4883 scope.go:117] "RemoveContainer" containerID="86f2b3c9600146e785777999cdc5d4ea906b5ad635853fcbc695d4a1b48ea493"
Mar 10 09:28:03 crc kubenswrapper[4883]: I0310 09:28:03.915080 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-gvmzc"
Mar 10 09:28:03 crc kubenswrapper[4883]: I0310 09:28:03.936500 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") pod \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\" (UID: \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\") "
Mar 10 09:28:03 crc kubenswrapper[4883]: I0310 09:28:03.942093 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl" (OuterVolumeSpecName: "kube-api-access-7svsl") pod "804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" (UID: "804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce"). InnerVolumeSpecName "kube-api-access-7svsl".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.040138 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") on node \"crc\" DevicePath \"\"" Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.652314 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" event={"ID":"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce","Type":"ContainerDied","Data":"ca84c9526f517c8e62b25d8452228a7e93d95d315eeae37348887444851278b0"} Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.652691 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca84c9526f517c8e62b25d8452228a7e93d95d315eeae37348887444851278b0" Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.652386 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.970246 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"] Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.976533 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"] Mar 10 09:28:06 crc kubenswrapper[4883]: I0310 09:28:06.089732 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12cada45-6ba5-4db1-9a13-3de652b390bb" path="/var/lib/kubelet/pods/12cada45-6ba5-4db1-9a13-3de652b390bb/volumes" Mar 10 09:28:47 crc kubenswrapper[4883]: I0310 09:28:47.449501 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 09:28:47 crc kubenswrapper[4883]: I0310 09:28:47.450118 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.848983 4883 scope.go:117] "RemoveContainer" containerID="ed2fdfaa94f84ea6d60f3a225de89340f67d80aec7fa6c18b378eb49e482b9b8" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.889141 4883 scope.go:117] "RemoveContainer" containerID="969226cfd74c44f03958486fd0daa3fd76ae699fc5b9689135438015e7c4fbc5" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.911117 4883 scope.go:117] "RemoveContainer" containerID="abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.938205 4883 scope.go:117] "RemoveContainer" containerID="8a043f976dc0ec960bd4342407fe8ea99f6aca698feb5f8a45170e566caea1a7" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.953563 4883 scope.go:117] "RemoveContainer" containerID="11797267a9434882035fa9951a681f2d2f7a8ff3989245476cda7f186fafdc21" Mar 10 09:29:17 crc kubenswrapper[4883]: I0310 09:29:17.449025 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:29:17 crc kubenswrapper[4883]: I0310 09:29:17.449678 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.448537 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.449194 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.449256 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.449803 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.449858 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" gracePeriod=600 Mar 10 09:29:47 crc kubenswrapper[4883]: E0310 09:29:47.572347 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.636343 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" exitCode=0 Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.636396 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"} Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.636441 4883 scope.go:117] "RemoveContainer" containerID="3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.637161 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:29:47 crc kubenswrapper[4883]: E0310 09:29:47.637390 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:29:58 crc kubenswrapper[4883]: I0310 09:29:58.080228 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:29:58 
crc kubenswrapper[4883]: E0310 09:29:58.081082 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.140059 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"] Mar 10 09:30:00 crc kubenswrapper[4883]: E0310 09:30:00.141778 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" containerName="oc" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.141856 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" containerName="oc" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.142158 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" containerName="oc" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.143113 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.146089 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.146313 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.146856 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.148039 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"] Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.240078 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6"] Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.241694 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.243888 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.244854 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.246784 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6"] Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.313057 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") pod \"auto-csr-approver-29552250-bzz2p\" (UID: \"aea7fca8-0ec0-44f9-b729-2c150761519f\") " pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.414967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") pod \"auto-csr-approver-29552250-bzz2p\" (UID: \"aea7fca8-0ec0-44f9-b729-2c150761519f\") " pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.415080 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 
09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.415149 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.415251 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.433152 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") pod \"auto-csr-approver-29552250-bzz2p\" (UID: \"aea7fca8-0ec0-44f9-b729-2c150761519f\") " pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.462968 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.518371 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.519565 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.520009 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.521272 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.524449 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.533409 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.557509 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.860536 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"] Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.871510 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.977244 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6"] Mar 10 09:30:00 crc kubenswrapper[4883]: W0310 09:30:00.980961 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda598a9af_7896_474b_8a2d_8b912f1e867f.slice/crio-b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215 WatchSource:0}: Error finding container b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215: Status 404 returned error can't find the container with id b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215 Mar 10 09:30:01 crc kubenswrapper[4883]: 
I0310 09:30:01.764290 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" event={"ID":"aea7fca8-0ec0-44f9-b729-2c150761519f","Type":"ContainerStarted","Data":"c7563826ba891aedae5e9849bbbf3e1d93261f7df06477e9ab1e881e2e951231"} Mar 10 09:30:01 crc kubenswrapper[4883]: I0310 09:30:01.766460 4883 generic.go:334] "Generic (PLEG): container finished" podID="a598a9af-7896-474b-8a2d-8b912f1e867f" containerID="007d3e7664b78fdbe1537c1501b8bf98b8877dc94b31a1d901b914a86cb7fa02" exitCode=0 Mar 10 09:30:01 crc kubenswrapper[4883]: I0310 09:30:01.766518 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" event={"ID":"a598a9af-7896-474b-8a2d-8b912f1e867f","Type":"ContainerDied","Data":"007d3e7664b78fdbe1537c1501b8bf98b8877dc94b31a1d901b914a86cb7fa02"} Mar 10 09:30:01 crc kubenswrapper[4883]: I0310 09:30:01.766541 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" event={"ID":"a598a9af-7896-474b-8a2d-8b912f1e867f","Type":"ContainerStarted","Data":"b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215"} Mar 10 09:30:02 crc kubenswrapper[4883]: I0310 09:30:02.778378 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" event={"ID":"aea7fca8-0ec0-44f9-b729-2c150761519f","Type":"ContainerStarted","Data":"55b96e78b0c842f2c57bd194eb8366216f87ea2f29ff4952bc2e23513be83089"} Mar 10 09:30:02 crc kubenswrapper[4883]: I0310 09:30:02.797996 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" podStartSLOduration=1.241920905 podStartE2EDuration="2.797975532s" podCreationTimestamp="2026-03-10 09:30:00 +0000 UTC" firstStartedPulling="2026-03-10 09:30:00.871206703 +0000 UTC m=+1587.126104592" lastFinishedPulling="2026-03-10 09:30:02.42726133 +0000 
UTC m=+1588.682159219" observedRunningTime="2026-03-10 09:30:02.794449323 +0000 UTC m=+1589.049347212" watchObservedRunningTime="2026-03-10 09:30:02.797975532 +0000 UTC m=+1589.052873421" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.076183 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.272307 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") pod \"a598a9af-7896-474b-8a2d-8b912f1e867f\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.272387 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") pod \"a598a9af-7896-474b-8a2d-8b912f1e867f\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.272568 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") pod \"a598a9af-7896-474b-8a2d-8b912f1e867f\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.273775 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a598a9af-7896-474b-8a2d-8b912f1e867f" (UID: "a598a9af-7896-474b-8a2d-8b912f1e867f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.277671 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a598a9af-7896-474b-8a2d-8b912f1e867f" (UID: "a598a9af-7896-474b-8a2d-8b912f1e867f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.278397 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk" (OuterVolumeSpecName: "kube-api-access-4ntnk") pod "a598a9af-7896-474b-8a2d-8b912f1e867f" (UID: "a598a9af-7896-474b-8a2d-8b912f1e867f"). InnerVolumeSpecName "kube-api-access-4ntnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.374558 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.374609 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.374619 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.791961 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" 
event={"ID":"a598a9af-7896-474b-8a2d-8b912f1e867f","Type":"ContainerDied","Data":"b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215"} Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.792328 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.792196 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.794423 4883 generic.go:334] "Generic (PLEG): container finished" podID="aea7fca8-0ec0-44f9-b729-2c150761519f" containerID="55b96e78b0c842f2c57bd194eb8366216f87ea2f29ff4952bc2e23513be83089" exitCode=0 Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.794468 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" event={"ID":"aea7fca8-0ec0-44f9-b729-2c150761519f","Type":"ContainerDied","Data":"55b96e78b0c842f2c57bd194eb8366216f87ea2f29ff4952bc2e23513be83089"} Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.088117 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.207331 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") pod \"aea7fca8-0ec0-44f9-b729-2c150761519f\" (UID: \"aea7fca8-0ec0-44f9-b729-2c150761519f\") " Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.211767 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh" (OuterVolumeSpecName: "kube-api-access-dg4sh") pod "aea7fca8-0ec0-44f9-b729-2c150761519f" (UID: "aea7fca8-0ec0-44f9-b729-2c150761519f"). InnerVolumeSpecName "kube-api-access-dg4sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.310227 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.816141 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" event={"ID":"aea7fca8-0ec0-44f9-b729-2c150761519f","Type":"ContainerDied","Data":"c7563826ba891aedae5e9849bbbf3e1d93261f7df06477e9ab1e881e2e951231"} Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.816191 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-bzz2p"
Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.816205 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7563826ba891aedae5e9849bbbf3e1d93261f7df06477e9ab1e881e2e951231"
Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.856938 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"]
Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.863740 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"]
Mar 10 09:30:06 crc kubenswrapper[4883]: I0310 09:30:06.090290 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="391543cc-519b-4e01-8886-04bde62c5298" path="/var/lib/kubelet/pods/391543cc-519b-4e01-8886-04bde62c5298/volumes"
Mar 10 09:30:13 crc kubenswrapper[4883]: I0310 09:30:13.080852 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:30:13 crc kubenswrapper[4883]: E0310 09:30:13.081520 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:30:24 crc kubenswrapper[4883]: I0310 09:30:24.085951 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:30:24 crc kubenswrapper[4883]: E0310 09:30:24.087105 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:30:28 crc kubenswrapper[4883]: I0310 09:30:28.027730 4883 generic.go:334] "Generic (PLEG): container finished" podID="de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" containerID="5043f618c5751d9b1d780a8c20af9397bbb12825aec770e998b692d0b3a30888" exitCode=0
Mar 10 09:30:28 crc kubenswrapper[4883]: I0310 09:30:28.027835 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" event={"ID":"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd","Type":"ContainerDied","Data":"5043f618c5751d9b1d780a8c20af9397bbb12825aec770e998b692d0b3a30888"}
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.382148 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.482804 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") pod \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") "
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.483042 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") pod \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") "
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.483076 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") pod \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") "
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.483333 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") pod \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") "
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.488866 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" (UID: "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.489871 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq" (OuterVolumeSpecName: "kube-api-access-vqgvq") pod "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" (UID: "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd"). InnerVolumeSpecName "kube-api-access-vqgvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.506305 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" (UID: "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.509155 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory" (OuterVolumeSpecName: "inventory") pod "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" (UID: "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.585182 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") on node \"crc\" DevicePath \"\""
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.585213 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.585224 4883 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.585237 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.052559 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" event={"ID":"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd","Type":"ContainerDied","Data":"733b4fd44a5ac5d10292b081b7fba6dec6ce4014fc5f18ef316d40cce4e76602"}
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.052627 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="733b4fd44a5ac5d10292b081b7fba6dec6ce4014fc5f18ef316d40cce4e76602"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.052726 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.124612 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"]
Mar 10 09:30:30 crc kubenswrapper[4883]: E0310 09:30:30.125112 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a598a9af-7896-474b-8a2d-8b912f1e867f" containerName="collect-profiles"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125134 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a598a9af-7896-474b-8a2d-8b912f1e867f" containerName="collect-profiles"
Mar 10 09:30:30 crc kubenswrapper[4883]: E0310 09:30:30.125150 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea7fca8-0ec0-44f9-b729-2c150761519f" containerName="oc"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125156 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea7fca8-0ec0-44f9-b729-2c150761519f" containerName="oc"
Mar 10 09:30:30 crc kubenswrapper[4883]: E0310 09:30:30.125176 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125184 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125370 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125388 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a598a9af-7896-474b-8a2d-8b912f1e867f" containerName="collect-profiles"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125400 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea7fca8-0ec0-44f9-b729-2c150761519f" containerName="oc"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.126071 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.129082 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.129335 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.129948 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.130113 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.147746 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"]
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.194592 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.194745 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.194808 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.298375 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.298884 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.299290 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.303963 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.304234 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.318075 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.440086 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"
Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.909925 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"]
Mar 10 09:30:31 crc kubenswrapper[4883]: I0310 09:30:31.063250 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" event={"ID":"2428d4e5-b48e-45ad-9bfb-711c3b1e8471","Type":"ContainerStarted","Data":"d875b5a0a6f7adae1cffd30aa5fa08f42f89b6a3458e653c961dbd1e2be218bd"}
Mar 10 09:30:32 crc kubenswrapper[4883]: I0310 09:30:32.076855 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" event={"ID":"2428d4e5-b48e-45ad-9bfb-711c3b1e8471","Type":"ContainerStarted","Data":"6e2ca16529cbec88524c07de2c8616688102bc3697fc5f0c4b0e4d88eda0ea79"}
Mar 10 09:30:32 crc kubenswrapper[4883]: I0310 09:30:32.099964 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" podStartSLOduration=1.437661975 podStartE2EDuration="2.099942071s" podCreationTimestamp="2026-03-10 09:30:30 +0000 UTC" firstStartedPulling="2026-03-10 09:30:30.913239466 +0000 UTC m=+1617.168137355" lastFinishedPulling="2026-03-10 09:30:31.575519561 +0000 UTC m=+1617.830417451" observedRunningTime="2026-03-10 09:30:32.093008987 +0000 UTC m=+1618.347906876" watchObservedRunningTime="2026-03-10 09:30:32.099942071 +0000 UTC m=+1618.354839959"
Mar 10 09:30:37 crc kubenswrapper[4883]: I0310 09:30:37.080036 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:30:37 crc kubenswrapper[4883]: E0310 09:30:37.081013 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.146207 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"]
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.148760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.158828 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"]
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.191297 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.191392 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.191469 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.292873 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.293398 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.293522 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.293712 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.294183 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.311798 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.465433 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.939386 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"]
Mar 10 09:30:46 crc kubenswrapper[4883]: I0310 09:30:46.217749 4883 generic.go:334] "Generic (PLEG): container finished" podID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerID="03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d" exitCode=0
Mar 10 09:30:46 crc kubenswrapper[4883]: I0310 09:30:46.217838 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerDied","Data":"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d"}
Mar 10 09:30:46 crc kubenswrapper[4883]: I0310 09:30:46.217991 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerStarted","Data":"6219b9d72ad89a4d1269b3954abbe663509aa516271b361aed1997777f6d30fc"}
Mar 10 09:30:48 crc kubenswrapper[4883]: I0310 09:30:48.079910 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:30:48 crc kubenswrapper[4883]: E0310 09:30:48.080880 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:30:48 crc kubenswrapper[4883]: I0310 09:30:48.241348 4883 generic.go:334] "Generic (PLEG): container finished" podID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerID="a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7" exitCode=0
Mar 10 09:30:48 crc kubenswrapper[4883]: I0310 09:30:48.241415 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerDied","Data":"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7"}
Mar 10 09:30:49 crc kubenswrapper[4883]: I0310 09:30:49.251864 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerStarted","Data":"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22"}
Mar 10 09:30:49 crc kubenswrapper[4883]: I0310 09:30:49.272843 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6v8n9" podStartSLOduration=1.740131126 podStartE2EDuration="4.272826526s" podCreationTimestamp="2026-03-10 09:30:45 +0000 UTC" firstStartedPulling="2026-03-10 09:30:46.219393481 +0000 UTC m=+1632.474291370" lastFinishedPulling="2026-03-10 09:30:48.752088881 +0000 UTC m=+1635.006986770" observedRunningTime="2026-03-10 09:30:49.266506739 +0000 UTC m=+1635.521404628" watchObservedRunningTime="2026-03-10 09:30:49.272826526 +0000 UTC m=+1635.527724405"
Mar 10 09:30:55 crc kubenswrapper[4883]: I0310 09:30:55.466101 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:55 crc kubenswrapper[4883]: I0310 09:30:55.466678 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:55 crc kubenswrapper[4883]: I0310 09:30:55.511309 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:56 crc kubenswrapper[4883]: I0310 09:30:56.367606 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:56 crc kubenswrapper[4883]: I0310 09:30:56.410208 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"]
Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.345548 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6v8n9" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="registry-server" containerID="cri-o://0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22" gracePeriod=2
Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.779566 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.901894 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") pod \"7873bd43-b295-4b71-bc81-1d7c3a894778\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") "
Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.902053 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") pod \"7873bd43-b295-4b71-bc81-1d7c3a894778\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") "
Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.902181 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") pod \"7873bd43-b295-4b71-bc81-1d7c3a894778\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") "
Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.903312 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities" (OuterVolumeSpecName: "utilities") pod "7873bd43-b295-4b71-bc81-1d7c3a894778" (UID: "7873bd43-b295-4b71-bc81-1d7c3a894778"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.908621 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8" (OuterVolumeSpecName: "kube-api-access-d5wh8") pod "7873bd43-b295-4b71-bc81-1d7c3a894778" (UID: "7873bd43-b295-4b71-bc81-1d7c3a894778"). InnerVolumeSpecName "kube-api-access-d5wh8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.953561 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7873bd43-b295-4b71-bc81-1d7c3a894778" (UID: "7873bd43-b295-4b71-bc81-1d7c3a894778"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.006310 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") on node \"crc\" DevicePath \"\""
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.006361 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.006375 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.081596 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:30:59 crc kubenswrapper[4883]: E0310 09:30:59.082188 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.359309 4883 generic.go:334] "Generic (PLEG): container finished" podID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerID="0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22" exitCode=0
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.359377 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerDied","Data":"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22"}
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.359411 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6v8n9"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.359435 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerDied","Data":"6219b9d72ad89a4d1269b3954abbe663509aa516271b361aed1997777f6d30fc"}
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.359459 4883 scope.go:117] "RemoveContainer" containerID="0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.382327 4883 scope.go:117] "RemoveContainer" containerID="a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.393551 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"]
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.401340 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"]
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.422006 4883 scope.go:117] "RemoveContainer" containerID="03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.440950 4883 scope.go:117] "RemoveContainer" containerID="0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22"
Mar 10 09:30:59 crc kubenswrapper[4883]: E0310 09:30:59.441509 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22\": container with ID starting with 0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22 not found: ID does not exist" containerID="0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.441550 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22"} err="failed to get container status \"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22\": rpc error: code = NotFound desc = could not find container \"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22\": container with ID starting with 0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22 not found: ID does not exist"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.441577 4883 scope.go:117] "RemoveContainer" containerID="a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7"
Mar 10 09:30:59 crc kubenswrapper[4883]: E0310 09:30:59.441916 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7\": container with ID starting with a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7 not found: ID does not exist" containerID="a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.441948 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7"} err="failed to get container status \"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7\": rpc error: code = NotFound desc = could not find container \"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7\": container with ID starting with a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7 not found: ID does not exist"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.441970 4883 scope.go:117] "RemoveContainer" containerID="03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d"
Mar 10 09:30:59 crc kubenswrapper[4883]: E0310 09:30:59.442259 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d\": container with ID starting with 03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d not found: ID does not exist" containerID="03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d"
Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.442322 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d"} err="failed to get container status \"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d\": rpc error: code = NotFound desc = could not find container \"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d\": container with ID starting with 03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d not found: ID does not exist"
Mar 10 09:31:00 crc kubenswrapper[4883]: I0310 09:31:00.094827 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" path="/var/lib/kubelet/pods/7873bd43-b295-4b71-bc81-1d7c3a894778/volumes"
Mar 10 09:31:04 crc kubenswrapper[4883]: I0310 09:31:04.077661 4883 scope.go:117] "RemoveContainer" containerID="6ae90501c695dbc91d1432637dfb47eec6c627c8a17eb905f14422cd13154c3c"
Mar 10 09:31:13 crc kubenswrapper[4883]: I0310 09:31:13.080283 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:31:13 crc kubenswrapper[4883]: E0310 09:31:13.081233 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:31:26 crc kubenswrapper[4883]: I0310 09:31:26.080698 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:31:26 crc kubenswrapper[4883]: E0310 09:31:26.081392 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:31:39 crc kubenswrapper[4883]: I0310 09:31:39.080793 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:31:39 crc kubenswrapper[4883]: E0310 09:31:39.081687 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.038111 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mj8nd"]
Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.044266 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pv8r6"]
Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.051778 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"]
Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.056871 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mj8nd"]
Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.061936 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"]
Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.067034 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pv8r6"]
Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.029807 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"]
Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.036059 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"]
Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.041483 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-j9kwf"]
Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.047445 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"]
Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.053254 4883 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"] Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.059350 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-j9kwf"] Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.087877 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258d7844-9a92-460a-a768-a5dca2fb5db9" path="/var/lib/kubelet/pods/258d7844-9a92-460a-a768-a5dca2fb5db9/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.088539 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486b3226-21be-4783-8b29-abaf747a7693" path="/var/lib/kubelet/pods/486b3226-21be-4783-8b29-abaf747a7693/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.089113 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58599ed2-6176-4003-8bdc-2a1d805da51f" path="/var/lib/kubelet/pods/58599ed2-6176-4003-8bdc-2a1d805da51f/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.089745 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" path="/var/lib/kubelet/pods/6195b8a8-c8aa-4d92-b58b-066a2df99bd3/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.090773 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698612ed-a736-4d3d-9a0e-4c75fdd1400f" path="/var/lib/kubelet/pods/698612ed-a736-4d3d-9a0e-4c75fdd1400f/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.091317 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" path="/var/lib/kubelet/pods/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34/volumes" Mar 10 09:31:54 crc kubenswrapper[4883]: I0310 09:31:54.085687 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:31:54 crc kubenswrapper[4883]: E0310 09:31:54.086765 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.137743 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:32:00 crc kubenswrapper[4883]: E0310 09:32:00.138942 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="registry-server" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.138957 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="registry-server" Mar 10 09:32:00 crc kubenswrapper[4883]: E0310 09:32:00.138974 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="extract-content" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.138982 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="extract-content" Mar 10 09:32:00 crc kubenswrapper[4883]: E0310 09:32:00.139004 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="extract-utilities" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.139010 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="extract-utilities" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.139242 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="registry-server" Mar 
10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.139982 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.142234 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.142498 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.142625 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.152331 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.177816 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mltvg\" (UniqueName: \"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") pod \"auto-csr-approver-29552252-vcsb9\" (UID: \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\") " pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.280401 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mltvg\" (UniqueName: \"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") pod \"auto-csr-approver-29552252-vcsb9\" (UID: \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\") " pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.300194 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mltvg\" (UniqueName: 
\"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") pod \"auto-csr-approver-29552252-vcsb9\" (UID: \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\") " pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.459307 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.846908 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.883130 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" event={"ID":"fc74aa89-09d6-4974-a6c1-1642f6ef0a64","Type":"ContainerStarted","Data":"bd31784591c30d4ac2e343f3fe05ea06508177f3dad3922412f06d03f3987004"} Mar 10 09:32:01 crc kubenswrapper[4883]: I0310 09:32:01.894710 4883 generic.go:334] "Generic (PLEG): container finished" podID="2428d4e5-b48e-45ad-9bfb-711c3b1e8471" containerID="6e2ca16529cbec88524c07de2c8616688102bc3697fc5f0c4b0e4d88eda0ea79" exitCode=0 Mar 10 09:32:01 crc kubenswrapper[4883]: I0310 09:32:01.894787 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" event={"ID":"2428d4e5-b48e-45ad-9bfb-711c3b1e8471","Type":"ContainerDied","Data":"6e2ca16529cbec88524c07de2c8616688102bc3697fc5f0c4b0e4d88eda0ea79"} Mar 10 09:32:02 crc kubenswrapper[4883]: I0310 09:32:02.905962 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" containerID="76a36df1ff76227c193949f769a79a8229f0c35af6ce9046d5c6bb133c432611" exitCode=0 Mar 10 09:32:02 crc kubenswrapper[4883]: I0310 09:32:02.906082 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" 
event={"ID":"fc74aa89-09d6-4974-a6c1-1642f6ef0a64","Type":"ContainerDied","Data":"76a36df1ff76227c193949f769a79a8229f0c35af6ce9046d5c6bb133c432611"} Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.235298 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.343716 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") pod \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.343777 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") pod \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.343806 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") pod \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.349204 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd" (OuterVolumeSpecName: "kube-api-access-cqrjd") pod "2428d4e5-b48e-45ad-9bfb-711c3b1e8471" (UID: "2428d4e5-b48e-45ad-9bfb-711c3b1e8471"). InnerVolumeSpecName "kube-api-access-cqrjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.367647 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2428d4e5-b48e-45ad-9bfb-711c3b1e8471" (UID: "2428d4e5-b48e-45ad-9bfb-711c3b1e8471"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.369773 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory" (OuterVolumeSpecName: "inventory") pod "2428d4e5-b48e-45ad-9bfb-711c3b1e8471" (UID: "2428d4e5-b48e-45ad-9bfb-711c3b1e8471"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.447003 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.447039 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.447054 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.917998 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" 
event={"ID":"2428d4e5-b48e-45ad-9bfb-711c3b1e8471","Type":"ContainerDied","Data":"d875b5a0a6f7adae1cffd30aa5fa08f42f89b6a3458e653c961dbd1e2be218bd"} Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.918033 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.918061 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d875b5a0a6f7adae1cffd30aa5fa08f42f89b6a3458e653c961dbd1e2be218bd" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.987924 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm"] Mar 10 09:32:03 crc kubenswrapper[4883]: E0310 09:32:03.988538 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2428d4e5-b48e-45ad-9bfb-711c3b1e8471" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.988561 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2428d4e5-b48e-45ad-9bfb-711c3b1e8471" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.988830 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2428d4e5-b48e-45ad-9bfb-711c3b1e8471" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.989745 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.994548 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.994586 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.994639 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.994540 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.007168 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm"] Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.146789 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.163517 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.163609 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.163760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.164847 4883 scope.go:117] "RemoveContainer" containerID="067357e3a311966c2851280ac43f9cd60519f964886eed01635d2c8acbc8045b" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.209366 4883 scope.go:117] "RemoveContainer" containerID="86cc309342e04f12de9f243fac1e7adc270651f62f05738383b3854942ebc072" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.238685 4883 scope.go:117] "RemoveContainer" 
containerID="9b3a01ef455743297929fe3e8d915e6b5c1a6d87ee8313151edd54b3c5c1c1d3" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.265259 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mltvg\" (UniqueName: \"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") pod \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\" (UID: \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\") " Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.265741 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.265793 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.265880 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.270655 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg" (OuterVolumeSpecName: "kube-api-access-mltvg") pod "fc74aa89-09d6-4974-a6c1-1642f6ef0a64" (UID: "fc74aa89-09d6-4974-a6c1-1642f6ef0a64"). InnerVolumeSpecName "kube-api-access-mltvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.271355 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.271351 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.272043 4883 scope.go:117] "RemoveContainer" containerID="e7877e4d896a5e48fb94d0bb9e636d179a97dbbe531d524d3bf059533ec08d74" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.281774 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.306041 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.338104 4883 scope.go:117] "RemoveContainer" containerID="6677f5c2edc8cf5df63041699d2713762ffd5b4bdf18bb3f374e397d55004166" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.358711 4883 scope.go:117] "RemoveContainer" containerID="b981b386d21855c9b21b1262acdcccebfb4995ef8da840373e95a5a29e03699c" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.368504 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mltvg\" (UniqueName: \"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.781888 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm"] Mar 10 09:32:04 crc kubenswrapper[4883]: W0310 09:32:04.786004 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ddb6af_f2c7_46eb_aac4_fe69996caf27.slice/crio-1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829 WatchSource:0}: Error finding container 1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829: Status 404 returned error can't find the container with id 1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829 Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.927851 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" event={"ID":"fc74aa89-09d6-4974-a6c1-1642f6ef0a64","Type":"ContainerDied","Data":"bd31784591c30d4ac2e343f3fe05ea06508177f3dad3922412f06d03f3987004"} Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.927899 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.927922 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd31784591c30d4ac2e343f3fe05ea06508177f3dad3922412f06d03f3987004" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.930502 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" event={"ID":"07ddb6af-f2c7-46eb-aac4-fe69996caf27","Type":"ContainerStarted","Data":"1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829"} Mar 10 09:32:05 crc kubenswrapper[4883]: I0310 09:32:05.216712 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:32:05 crc kubenswrapper[4883]: I0310 09:32:05.224993 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:32:05 crc kubenswrapper[4883]: I0310 09:32:05.942351 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" event={"ID":"07ddb6af-f2c7-46eb-aac4-fe69996caf27","Type":"ContainerStarted","Data":"2bfb6eb0197869370269e475b7e819de3708dd20b2a400d6accc4988b6fa951b"} Mar 10 09:32:06 crc kubenswrapper[4883]: I0310 09:32:06.089677 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" path="/var/lib/kubelet/pods/bcfbbeba-ae1f-4e53-ba68-3cc981395803/volumes" Mar 10 09:32:07 crc kubenswrapper[4883]: I0310 09:32:07.080711 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:32:07 crc kubenswrapper[4883]: E0310 09:32:07.081268 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:32:10 crc kubenswrapper[4883]: I0310 09:32:10.023123 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" podStartSLOduration=6.447872441 podStartE2EDuration="7.023096805s" podCreationTimestamp="2026-03-10 09:32:03 +0000 UTC" firstStartedPulling="2026-03-10 09:32:04.788285587 +0000 UTC m=+1711.043183477" lastFinishedPulling="2026-03-10 09:32:05.363509941 +0000 UTC m=+1711.618407841" observedRunningTime="2026-03-10 09:32:05.979844162 +0000 UTC m=+1712.234742051" watchObservedRunningTime="2026-03-10 09:32:10.023096805 +0000 UTC m=+1716.277994695"
Mar 10 09:32:10 crc kubenswrapper[4883]: I0310 09:32:10.032820 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cdc4t"]
Mar 10 09:32:10 crc kubenswrapper[4883]: I0310 09:32:10.039980 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cdc4t"]
Mar 10 09:32:10 crc kubenswrapper[4883]: I0310 09:32:10.092501 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d523ed0-183e-4bec-a110-fe622b69ef79" path="/var/lib/kubelet/pods/5d523ed0-183e-4bec-a110-fe622b69ef79/volumes"
Mar 10 09:32:13 crc kubenswrapper[4883]: I0310 09:32:13.029561 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xtrqg"]
Mar 10 09:32:13 crc kubenswrapper[4883]: I0310 09:32:13.035076 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xtrqg"]
Mar 10 09:32:14 crc kubenswrapper[4883]: I0310 09:32:14.092842 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5485539-c722-477d-b595-649e07eac50e" path="/var/lib/kubelet/pods/d5485539-c722-477d-b595-649e07eac50e/volumes"
Mar 10 09:32:19 crc kubenswrapper[4883]: I0310 09:32:19.079871 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:32:19 crc kubenswrapper[4883]: E0310 09:32:19.080935 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:32:25 crc kubenswrapper[4883]: I0310 09:32:25.046227 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"]
Mar 10 09:32:25 crc kubenswrapper[4883]: I0310 09:32:25.052738 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"]
Mar 10 09:32:25 crc kubenswrapper[4883]: I0310 09:32:25.057937 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"]
Mar 10 09:32:25 crc kubenswrapper[4883]: I0310 09:32:25.063032 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"]
Mar 10 09:32:26 crc kubenswrapper[4883]: I0310 09:32:26.089939 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" path="/var/lib/kubelet/pods/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7/volumes"
Mar 10 09:32:26 crc kubenswrapper[4883]: I0310 09:32:26.090604 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed176738-d518-45e3-be47-3ace090d0e7a" path="/var/lib/kubelet/pods/ed176738-d518-45e3-be47-3ace090d0e7a/volumes"
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.027716 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8664j"]
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.034878 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wzxkv"]
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.040535 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hrq22"]
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.045437 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8664j"]
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.050433 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"]
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.055730 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wzxkv"]
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.060839 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hrq22"]
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.065998 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"]
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.091813 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a8b78f-e864-49d5-9dfb-aebd86741885" path="/var/lib/kubelet/pods/07a8b78f-e864-49d5-9dfb-aebd86741885/volumes"
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.092387 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24713bd6-5868-43ec-94ec-2371a49a0b88" path="/var/lib/kubelet/pods/24713bd6-5868-43ec-94ec-2371a49a0b88/volumes"
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.092959 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94df275b-e089-4e1f-8eac-e4806d2f1178" path="/var/lib/kubelet/pods/94df275b-e089-4e1f-8eac-e4806d2f1178/volumes"
Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.093542 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7aa202-c734-4333-a1de-1bdb39d59804" path="/var/lib/kubelet/pods/ed7aa202-c734-4333-a1de-1bdb39d59804/volumes"
Mar 10 09:32:31 crc kubenswrapper[4883]: I0310 09:32:31.026527 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-w5q6s"]
Mar 10 09:32:31 crc kubenswrapper[4883]: I0310 09:32:31.032725 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-w5q6s"]
Mar 10 09:32:32 crc kubenswrapper[4883]: I0310 09:32:32.093794 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96942836-243a-48c5-be3d-5eb5e5f166d0" path="/var/lib/kubelet/pods/96942836-243a-48c5-be3d-5eb5e5f166d0/volumes"
Mar 10 09:32:33 crc kubenswrapper[4883]: I0310 09:32:33.079811 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:32:33 crc kubenswrapper[4883]: E0310 09:32:33.080346 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:32:45 crc kubenswrapper[4883]: I0310 09:32:45.079715 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:32:45 crc kubenswrapper[4883]: E0310 09:32:45.080321 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:32:54 crc kubenswrapper[4883]: I0310 09:32:54.034834 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kcmgr"]
Mar 10 09:32:54 crc kubenswrapper[4883]: I0310 09:32:54.040628 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kcmgr"]
Mar 10 09:32:54 crc kubenswrapper[4883]: I0310 09:32:54.138705 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a25758-cb77-448b-a856-3dbc6df2bc21" path="/var/lib/kubelet/pods/f5a25758-cb77-448b-a856-3dbc6df2bc21/volumes"
Mar 10 09:32:57 crc kubenswrapper[4883]: I0310 09:32:57.080062 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:32:57 crc kubenswrapper[4883]: E0310 09:32:57.080434 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:32:58 crc kubenswrapper[4883]: I0310 09:32:58.419095 4883 generic.go:334] "Generic (PLEG): container finished" podID="07ddb6af-f2c7-46eb-aac4-fe69996caf27" containerID="2bfb6eb0197869370269e475b7e819de3708dd20b2a400d6accc4988b6fa951b" exitCode=0
Mar 10 09:32:58 crc kubenswrapper[4883]: I0310 09:32:58.419186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" event={"ID":"07ddb6af-f2c7-46eb-aac4-fe69996caf27","Type":"ContainerDied","Data":"2bfb6eb0197869370269e475b7e819de3708dd20b2a400d6accc4988b6fa951b"}
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.764387 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm"
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.823095 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") pod \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") "
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.823591 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") pod \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") "
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.823745 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") pod \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") "
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.828854 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb" (OuterVolumeSpecName: "kube-api-access-8dmqb") pod "07ddb6af-f2c7-46eb-aac4-fe69996caf27" (UID: "07ddb6af-f2c7-46eb-aac4-fe69996caf27"). InnerVolumeSpecName "kube-api-access-8dmqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.856690 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory" (OuterVolumeSpecName: "inventory") pod "07ddb6af-f2c7-46eb-aac4-fe69996caf27" (UID: "07ddb6af-f2c7-46eb-aac4-fe69996caf27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.859023 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "07ddb6af-f2c7-46eb-aac4-fe69996caf27" (UID: "07ddb6af-f2c7-46eb-aac4-fe69996caf27"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.925949 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") on node \"crc\" DevicePath \"\""
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.925991 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.926005 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.439773 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" event={"ID":"07ddb6af-f2c7-46eb-aac4-fe69996caf27","Type":"ContainerDied","Data":"1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829"}
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.439833 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.439844 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.512596 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"]
Mar 10 09:33:00 crc kubenswrapper[4883]: E0310 09:33:00.513129 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" containerName="oc"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.513158 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" containerName="oc"
Mar 10 09:33:00 crc kubenswrapper[4883]: E0310 09:33:00.513174 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ddb6af-f2c7-46eb-aac4-fe69996caf27" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.513183 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ddb6af-f2c7-46eb-aac4-fe69996caf27" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.513375 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ddb6af-f2c7-46eb-aac4-fe69996caf27" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.513409 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" containerName="oc"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.514116 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.515861 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.519402 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.519594 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.519687 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.521540 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"]
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.638996 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.639059 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.639356 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.741243 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.741703 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.741847 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.748085 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.748395 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.757693 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.828233 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:01 crc kubenswrapper[4883]: I0310 09:33:01.293394 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"]
Mar 10 09:33:01 crc kubenswrapper[4883]: I0310 09:33:01.448078 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" event={"ID":"20e06399-dd26-4a60-a6b7-261cc4505a92","Type":"ContainerStarted","Data":"8b05e77e3c54913dfd6f3cc0d546e79ac9a8a73933f4f22991d74e31f203e70c"}
Mar 10 09:33:02 crc kubenswrapper[4883]: I0310 09:33:02.475357 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" event={"ID":"20e06399-dd26-4a60-a6b7-261cc4505a92","Type":"ContainerStarted","Data":"27e15d8be705ee66290bb5461ef69b684491fbcd39e4ef8202689d4194ff9079"}
Mar 10 09:33:02 crc kubenswrapper[4883]: I0310 09:33:02.495704 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" podStartSLOduration=1.686731567 podStartE2EDuration="2.495683475s" podCreationTimestamp="2026-03-10 09:33:00 +0000 UTC" firstStartedPulling="2026-03-10 09:33:01.296336013 +0000 UTC m=+1767.551233902" lastFinishedPulling="2026-03-10 09:33:02.10528792 +0000 UTC m=+1768.360185810" observedRunningTime="2026-03-10 09:33:02.490818381 +0000 UTC m=+1768.745716270" watchObservedRunningTime="2026-03-10 09:33:02.495683475 +0000 UTC m=+1768.750581364"
Mar 10 09:33:03 crc kubenswrapper[4883]: I0310 09:33:03.025941 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-jt8bs"]
Mar 10 09:33:03 crc kubenswrapper[4883]: I0310 09:33:03.032165 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jt8bs"]
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.091790 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d78560f-1b01-4ac1-9c36-109595422d78" path="/var/lib/kubelet/pods/6d78560f-1b01-4ac1-9c36-109595422d78/volumes"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.476632 4883 scope.go:117] "RemoveContainer" containerID="655c18b53a1ce432bb921c39b01b2479f6ad37de70ab46a32b33f42c98fb125b"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.506521 4883 scope.go:117] "RemoveContainer" containerID="9f05adebe53489f83df9e03cf5da9583790650f545f7218c9e2d571583c52501"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.536134 4883 scope.go:117] "RemoveContainer" containerID="a48c527e869a78aa5301ce2ab9632963d3e2d800250d247df83963b7da9be724"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.570073 4883 scope.go:117] "RemoveContainer" containerID="affbbf9bc93bb1cdc534fd16ed32d4696b867f4c70c0f6fa49bc5b18c4e55f72"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.593730 4883 scope.go:117] "RemoveContainer" containerID="c7587acba5dab37b49dbdd81924e01184e73978fd599f62b1af6671e7ae50b6e"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.620773 4883 scope.go:117] "RemoveContainer" containerID="a4591a3c1d7b1279937a98c799c83abe99429a53d01680d0592b50d1d359cd42"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.654442 4883 scope.go:117] "RemoveContainer" containerID="50e03d7f0d395af101a158c665d03ed09d9fb01ebb9c16be32b6ee2d458d2bcb"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.702795 4883 scope.go:117] "RemoveContainer" containerID="e71d6d0bfc1a16e1dac8ed8107010142581b341f08eea68db466ac02a741d979"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.738905 4883 scope.go:117] "RemoveContainer" containerID="8b81faa071a739cf8a7f25085f6d2124f3dcc3e17601b69b578e8e6f428069ce"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.758180 4883 scope.go:117] "RemoveContainer" containerID="98460e32a504c0e3ede8a9fd544c2c34e4954a1cfed507bb532c53cf560762fd"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.775660 4883 scope.go:117] "RemoveContainer" containerID="f847a3856c40dae8ee7ad4ac93ecaf18ffea7cc99907753e085972a5a353218f"
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.807738 4883 scope.go:117] "RemoveContainer" containerID="5dba08c9d93be005c0c85060006f6110a86c429508b6e36e94151d58e533d961"
Mar 10 09:33:06 crc kubenswrapper[4883]: I0310 09:33:06.516929 4883 generic.go:334] "Generic (PLEG): container finished" podID="20e06399-dd26-4a60-a6b7-261cc4505a92" containerID="27e15d8be705ee66290bb5461ef69b684491fbcd39e4ef8202689d4194ff9079" exitCode=0
Mar 10 09:33:06 crc kubenswrapper[4883]: I0310 09:33:06.516977 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" event={"ID":"20e06399-dd26-4a60-a6b7-261cc4505a92","Type":"ContainerDied","Data":"27e15d8be705ee66290bb5461ef69b684491fbcd39e4ef8202689d4194ff9079"}
Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.878273 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.897825 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") pod \"20e06399-dd26-4a60-a6b7-261cc4505a92\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") "
Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.898852 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") pod \"20e06399-dd26-4a60-a6b7-261cc4505a92\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") "
Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.898894 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") pod \"20e06399-dd26-4a60-a6b7-261cc4505a92\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") "
Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.924467 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2" (OuterVolumeSpecName: "kube-api-access-zrnx2") pod "20e06399-dd26-4a60-a6b7-261cc4505a92" (UID: "20e06399-dd26-4a60-a6b7-261cc4505a92"). InnerVolumeSpecName "kube-api-access-zrnx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.978595 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "20e06399-dd26-4a60-a6b7-261cc4505a92" (UID: "20e06399-dd26-4a60-a6b7-261cc4505a92"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.003774 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") on node \"crc\" DevicePath \"\""
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.003813 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.038638 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory" (OuterVolumeSpecName: "inventory") pod "20e06399-dd26-4a60-a6b7-261cc4505a92" (UID: "20e06399-dd26-4a60-a6b7-261cc4505a92"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.106698 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.538451 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" event={"ID":"20e06399-dd26-4a60-a6b7-261cc4505a92","Type":"ContainerDied","Data":"8b05e77e3c54913dfd6f3cc0d546e79ac9a8a73933f4f22991d74e31f203e70c"}
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.538538 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b05e77e3c54913dfd6f3cc0d546e79ac9a8a73933f4f22991d74e31f203e70c"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.538552 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.589067 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"]
Mar 10 09:33:08 crc kubenswrapper[4883]: E0310 09:33:08.589500 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e06399-dd26-4a60-a6b7-261cc4505a92" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.589521 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e06399-dd26-4a60-a6b7-261cc4505a92" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.589697 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e06399-dd26-4a60-a6b7-261cc4505a92" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.590281 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.592442 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.592466 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.592791 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.593119 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.597343 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"]
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.618250 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.618388 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.618534 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.720747 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.720823 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.720907 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.725315 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.725433 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.735576 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.903574 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"
Mar 10 09:33:09 crc kubenswrapper[4883]: I0310 09:33:09.036181 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n7f74"]
Mar 10 09:33:09 crc kubenswrapper[4883]: I0310 09:33:09.043905 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n7f74"]
Mar 10 09:33:09 crc kubenswrapper[4883]: I0310 09:33:09.378143 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"]
Mar 10 09:33:09 crc kubenswrapper[4883]: I0310 09:33:09.550572 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" event={"ID":"361b2613-f26e-45c3-aabe-9a0f115e8e10","Type":"ContainerStarted","Data":"5c356be1c10709502948cd7744788e12bf33b0f2417e3251b6ef4163577344fe"}
Mar 10 09:33:10 crc kubenswrapper[4883]: I0310 09:33:10.089769 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" path="/var/lib/kubelet/pods/5acdd73a-9879-4507-8f6d-10e2ad8065e4/volumes"
Mar 10 09:33:10 crc kubenswrapper[4883]: I0310 09:33:10.561455 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" event={"ID":"361b2613-f26e-45c3-aabe-9a0f115e8e10","Type":"ContainerStarted","Data":"1e093362aedc93f97368ec66a4b731148badab0e6dd22037fa34966ee5d3c592"}
Mar 10 09:33:10 crc kubenswrapper[4883]: I0310 09:33:10.579155 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" podStartSLOduration=2.068709002 podStartE2EDuration="2.57912773s" podCreationTimestamp="2026-03-10 09:33:08 +0000 UTC" firstStartedPulling="2026-03-10 09:33:09.384962892 +0000 UTC m=+1775.639860770" lastFinishedPulling="2026-03-10 09:33:09.895381609 +0000 UTC m=+1776.150279498" observedRunningTime="2026-03-10 09:33:10.573359844 +0000 UTC m=+1776.828257734" watchObservedRunningTime="2026-03-10 09:33:10.57912773 +0000 UTC m=+1776.834025619"
Mar 10 09:33:11 crc kubenswrapper[4883]: I0310 09:33:11.079858 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:33:11 crc kubenswrapper[4883]: E0310 09:33:11.080358 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:33:19 crc kubenswrapper[4883]: I0310 09:33:19.051107 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-v9wqz"]
Mar 10 09:33:19 crc kubenswrapper[4883]: I0310 09:33:19.061573 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-v9wqz"]
Mar 10 09:33:20 crc kubenswrapper[4883]: I0310 09:33:20.090851 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" path="/var/lib/kubelet/pods/78bfcd03-74e4-4238-ae81-043bc04105cd/volumes"
Mar 10 09:33:24 crc kubenswrapper[4883]: I0310 09:33:24.037703 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-x2hf5"]
Mar 10 09:33:24 crc kubenswrapper[4883]: I0310 09:33:24.044064 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-x2hf5"]
Mar 10 09:33:24 crc kubenswrapper[4883]: I0310 09:33:24.084963 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:33:24 crc kubenswrapper[4883]: E0310 09:33:24.085290 4883
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:33:24 crc kubenswrapper[4883]: I0310 09:33:24.088936 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" path="/var/lib/kubelet/pods/dc0b1d9d-7834-473a-a487-6f540c606706/volumes" Mar 10 09:33:37 crc kubenswrapper[4883]: I0310 09:33:37.080185 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:33:37 crc kubenswrapper[4883]: E0310 09:33:37.080946 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:33:38 crc kubenswrapper[4883]: I0310 09:33:38.809489 4883 generic.go:334] "Generic (PLEG): container finished" podID="361b2613-f26e-45c3-aabe-9a0f115e8e10" containerID="1e093362aedc93f97368ec66a4b731148badab0e6dd22037fa34966ee5d3c592" exitCode=0 Mar 10 09:33:38 crc kubenswrapper[4883]: I0310 09:33:38.809572 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" event={"ID":"361b2613-f26e-45c3-aabe-9a0f115e8e10","Type":"ContainerDied","Data":"1e093362aedc93f97368ec66a4b731148badab0e6dd22037fa34966ee5d3c592"} Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 
09:33:40.279959 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.383809 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") pod \"361b2613-f26e-45c3-aabe-9a0f115e8e10\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.384014 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") pod \"361b2613-f26e-45c3-aabe-9a0f115e8e10\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.384141 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") pod \"361b2613-f26e-45c3-aabe-9a0f115e8e10\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.388984 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4" (OuterVolumeSpecName: "kube-api-access-nn8s4") pod "361b2613-f26e-45c3-aabe-9a0f115e8e10" (UID: "361b2613-f26e-45c3-aabe-9a0f115e8e10"). InnerVolumeSpecName "kube-api-access-nn8s4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.409043 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory" (OuterVolumeSpecName: "inventory") pod "361b2613-f26e-45c3-aabe-9a0f115e8e10" (UID: "361b2613-f26e-45c3-aabe-9a0f115e8e10"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.410788 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "361b2613-f26e-45c3-aabe-9a0f115e8e10" (UID: "361b2613-f26e-45c3-aabe-9a0f115e8e10"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.485878 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.485907 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.485919 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.829361 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" 
event={"ID":"361b2613-f26e-45c3-aabe-9a0f115e8e10","Type":"ContainerDied","Data":"5c356be1c10709502948cd7744788e12bf33b0f2417e3251b6ef4163577344fe"} Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.829435 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c356be1c10709502948cd7744788e12bf33b0f2417e3251b6ef4163577344fe" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.829434 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.896010 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh"] Mar 10 09:33:40 crc kubenswrapper[4883]: E0310 09:33:40.896875 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361b2613-f26e-45c3-aabe-9a0f115e8e10" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.896898 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="361b2613-f26e-45c3-aabe-9a0f115e8e10" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.897125 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="361b2613-f26e-45c3-aabe-9a0f115e8e10" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.897879 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.900094 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.900519 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.900657 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.901440 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.904872 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh"] Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.996986 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.997070 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:40 crc 
kubenswrapper[4883]: I0310 09:33:40.997158 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.100237 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.101110 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.101559 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.105232 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.106052 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.118021 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.212912 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.676960 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh"] Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.838746 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" event={"ID":"269dd9c8-3d75-4892-9f75-c4fe1b9093b8","Type":"ContainerStarted","Data":"8303a8bf083bd2d991e220d511db2004161e87ee602ae2da6f0bac8a22dc1f07"} Mar 10 09:33:42 crc kubenswrapper[4883]: I0310 09:33:42.856939 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" event={"ID":"269dd9c8-3d75-4892-9f75-c4fe1b9093b8","Type":"ContainerStarted","Data":"7062c75f5b57df7b7f1570cd08c40c2a3f71aa8c69bb0f4ca90d1b87f910e784"} Mar 10 09:33:42 crc kubenswrapper[4883]: I0310 09:33:42.879329 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" podStartSLOduration=2.127707916 podStartE2EDuration="2.879312129s" podCreationTimestamp="2026-03-10 09:33:40 +0000 UTC" firstStartedPulling="2026-03-10 09:33:41.681365831 +0000 UTC m=+1807.936263710" lastFinishedPulling="2026-03-10 09:33:42.432970033 +0000 UTC m=+1808.687867923" observedRunningTime="2026-03-10 09:33:42.86928366 +0000 UTC m=+1809.124181549" watchObservedRunningTime="2026-03-10 09:33:42.879312129 +0000 UTC m=+1809.134210008" Mar 10 09:33:48 crc kubenswrapper[4883]: I0310 09:33:48.080565 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:33:48 crc kubenswrapper[4883]: E0310 09:33:48.081700 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:33:59 crc kubenswrapper[4883]: I0310 09:33:59.034548 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"] Mar 10 09:33:59 crc kubenswrapper[4883]: I0310 09:33:59.039104 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.022559 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.030838 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.091680 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d355ddcd-9120-4436-84c4-928027e6ee33" path="/var/lib/kubelet/pods/d355ddcd-9120-4436-84c4-928027e6ee33/volumes" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.092298 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" path="/var/lib/kubelet/pods/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8/volumes" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.144317 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.146174 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.148778 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.148835 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.152711 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.154628 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.321984 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") pod \"auto-csr-approver-29552254-d9q7p\" (UID: \"b477b90a-75af-4621-8c33-21fdd8c9c749\") " pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.425126 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") pod \"auto-csr-approver-29552254-d9q7p\" (UID: \"b477b90a-75af-4621-8c33-21fdd8c9c749\") " pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.444711 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") pod \"auto-csr-approver-29552254-d9q7p\" (UID: \"b477b90a-75af-4621-8c33-21fdd8c9c749\") " 
pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.470812 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.878712 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.012990 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" event={"ID":"b477b90a-75af-4621-8c33-21fdd8c9c749","Type":"ContainerStarted","Data":"9c061b891dab6d5107f249b527e3bc76699e556e53745a31ae035535d96ba2dc"} Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.045427 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-zr486"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.057543 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.068062 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4vxd6"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.076972 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.096064 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zr486"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.106982 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.113983 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4vxd6"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.119112 4883 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:34:02 crc kubenswrapper[4883]: I0310 09:34:02.089725 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e39def71-60ef-4b2a-823b-1c5e89e02647" path="/var/lib/kubelet/pods/e39def71-60ef-4b2a-823b-1c5e89e02647/volumes" Mar 10 09:34:02 crc kubenswrapper[4883]: I0310 09:34:02.090577 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9dd286b-6aa5-4525-a645-8e4ec79af348" path="/var/lib/kubelet/pods/e9dd286b-6aa5-4525-a645-8e4ec79af348/volumes" Mar 10 09:34:02 crc kubenswrapper[4883]: I0310 09:34:02.091043 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" path="/var/lib/kubelet/pods/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9/volumes" Mar 10 09:34:02 crc kubenswrapper[4883]: I0310 09:34:02.091532 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" path="/var/lib/kubelet/pods/fdbd0859-6f93-4118-9e5b-2170ec3d43ad/volumes" Mar 10 09:34:03 crc kubenswrapper[4883]: I0310 09:34:03.030942 4883 generic.go:334] "Generic (PLEG): container finished" podID="b477b90a-75af-4621-8c33-21fdd8c9c749" containerID="6864877f7abf0513eaa87f372fd6fb5c7baab57f240f7c0cd19def879aaf0dc8" exitCode=0 Mar 10 09:34:03 crc kubenswrapper[4883]: I0310 09:34:03.030993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" event={"ID":"b477b90a-75af-4621-8c33-21fdd8c9c749","Type":"ContainerDied","Data":"6864877f7abf0513eaa87f372fd6fb5c7baab57f240f7c0cd19def879aaf0dc8"} Mar 10 09:34:03 crc kubenswrapper[4883]: I0310 09:34:03.079541 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:34:03 crc kubenswrapper[4883]: E0310 09:34:03.079908 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:34:04 crc kubenswrapper[4883]: I0310 09:34:04.327023 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:04 crc kubenswrapper[4883]: I0310 09:34:04.403295 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") pod \"b477b90a-75af-4621-8c33-21fdd8c9c749\" (UID: \"b477b90a-75af-4621-8c33-21fdd8c9c749\") " Mar 10 09:34:04 crc kubenswrapper[4883]: I0310 09:34:04.409653 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l" (OuterVolumeSpecName: "kube-api-access-lds9l") pod "b477b90a-75af-4621-8c33-21fdd8c9c749" (UID: "b477b90a-75af-4621-8c33-21fdd8c9c749"). InnerVolumeSpecName "kube-api-access-lds9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:04 crc kubenswrapper[4883]: I0310 09:34:04.506337 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.003591 4883 scope.go:117] "RemoveContainer" containerID="d23bd9bd934e3d99aa1bfc6ccd981d2c32f179368b140bc3665f8538d2c19637" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.026742 4883 scope.go:117] "RemoveContainer" containerID="447fb8e21bbb8bc037b2913c14d7e94664156b41e117d80c9e5a7c41f589d745" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.050838 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" event={"ID":"b477b90a-75af-4621-8c33-21fdd8c9c749","Type":"ContainerDied","Data":"9c061b891dab6d5107f249b527e3bc76699e556e53745a31ae035535d96ba2dc"} Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.050887 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c061b891dab6d5107f249b527e3bc76699e556e53745a31ae035535d96ba2dc" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.051214 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.070993 4883 scope.go:117] "RemoveContainer" containerID="edddf942ff54cf02d31c8d37d1a93a850752455b76c3f9b8d5acabfd5e985820" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.093295 4883 scope.go:117] "RemoveContainer" containerID="e1764b4c527a37975cc4e8ef8e724e3a447bf2698dc21042ec02c05dc2aa830c" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.127345 4883 scope.go:117] "RemoveContainer" containerID="d00a8e52c275926dd98e27f733cab6f0434bb178955532902b15a5e7ab5d08f8" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.146920 4883 scope.go:117] "RemoveContainer" containerID="911a3a55df750ade4baf157ba45ca477442b34e3cdb85095ba2fd8c7ca80b8fe" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.164297 4883 scope.go:117] "RemoveContainer" containerID="0058e7bce226cca970138ded52e080821de3de3f9381dfb13e4eee1dea226cbf" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.195665 4883 scope.go:117] "RemoveContainer" containerID="607df7cc400848b9303988905dd099e3a6ed13423fde585243a8fdb269315664" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.232026 4883 scope.go:117] "RemoveContainer" containerID="ca65d438bf33bf28ae2760bd85500809892b5530a19b9249afc6d3b1e6b1e723" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.388634 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"] Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.396232 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"] Mar 10 09:34:06 crc kubenswrapper[4883]: I0310 09:34:06.089339 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" path="/var/lib/kubelet/pods/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce/volumes" Mar 10 09:34:15 crc kubenswrapper[4883]: I0310 09:34:15.080443 4883 
scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:34:15 crc kubenswrapper[4883]: E0310 09:34:15.081385 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:34:19 crc kubenswrapper[4883]: I0310 09:34:19.179511 4883 generic.go:334] "Generic (PLEG): container finished" podID="269dd9c8-3d75-4892-9f75-c4fe1b9093b8" containerID="7062c75f5b57df7b7f1570cd08c40c2a3f71aa8c69bb0f4ca90d1b87f910e784" exitCode=0 Mar 10 09:34:19 crc kubenswrapper[4883]: I0310 09:34:19.179588 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" event={"ID":"269dd9c8-3d75-4892-9f75-c4fe1b9093b8","Type":"ContainerDied","Data":"7062c75f5b57df7b7f1570cd08c40c2a3f71aa8c69bb0f4ca90d1b87f910e784"} Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.542844 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.727087 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") pod \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.727341 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") pod \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.727638 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") pod \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.733625 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt" (OuterVolumeSpecName: "kube-api-access-djdnt") pod "269dd9c8-3d75-4892-9f75-c4fe1b9093b8" (UID: "269dd9c8-3d75-4892-9f75-c4fe1b9093b8"). InnerVolumeSpecName "kube-api-access-djdnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.753834 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory" (OuterVolumeSpecName: "inventory") pod "269dd9c8-3d75-4892-9f75-c4fe1b9093b8" (UID: "269dd9c8-3d75-4892-9f75-c4fe1b9093b8"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.754435 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "269dd9c8-3d75-4892-9f75-c4fe1b9093b8" (UID: "269dd9c8-3d75-4892-9f75-c4fe1b9093b8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.831154 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.831194 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.831210 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.209799 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" event={"ID":"269dd9c8-3d75-4892-9f75-c4fe1b9093b8","Type":"ContainerDied","Data":"8303a8bf083bd2d991e220d511db2004161e87ee602ae2da6f0bac8a22dc1f07"} Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.211811 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8303a8bf083bd2d991e220d511db2004161e87ee602ae2da6f0bac8a22dc1f07" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 
09:34:21.211957 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.268709 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v5v84"] Mar 10 09:34:21 crc kubenswrapper[4883]: E0310 09:34:21.269238 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269dd9c8-3d75-4892-9f75-c4fe1b9093b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.269260 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="269dd9c8-3d75-4892-9f75-c4fe1b9093b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:21 crc kubenswrapper[4883]: E0310 09:34:21.269319 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b477b90a-75af-4621-8c33-21fdd8c9c749" containerName="oc" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.269327 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="b477b90a-75af-4621-8c33-21fdd8c9c749" containerName="oc" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.269590 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="269dd9c8-3d75-4892-9f75-c4fe1b9093b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.269621 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="b477b90a-75af-4621-8c33-21fdd8c9c749" containerName="oc" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.270523 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.272975 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.273148 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.273175 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.273176 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.278152 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v5v84"] Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.344638 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.344712 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.344876 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.447294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.447490 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.447639 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.452378 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc 
kubenswrapper[4883]: I0310 09:34:21.453936 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.464320 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.584083 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:22 crc kubenswrapper[4883]: I0310 09:34:22.050970 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v5v84"] Mar 10 09:34:22 crc kubenswrapper[4883]: I0310 09:34:22.219177 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" event={"ID":"caa69332-97ab-4629-900f-1596af363ba4","Type":"ContainerStarted","Data":"aa00dd959f63d750c79bf5b931b12bc09736bfd10221fd826891011af80ca166"} Mar 10 09:34:23 crc kubenswrapper[4883]: I0310 09:34:23.042998 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:34:23 crc kubenswrapper[4883]: I0310 09:34:23.052998 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:34:23 crc kubenswrapper[4883]: I0310 09:34:23.231747 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" event={"ID":"caa69332-97ab-4629-900f-1596af363ba4","Type":"ContainerStarted","Data":"fb01332e612ee8643c9cd5071e07447d36df50055a15b4afd7c545e1c1d03333"} Mar 10 09:34:23 crc kubenswrapper[4883]: I0310 09:34:23.256943 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" podStartSLOduration=1.626030222 podStartE2EDuration="2.256921721s" podCreationTimestamp="2026-03-10 09:34:21 +0000 UTC" firstStartedPulling="2026-03-10 09:34:22.054412209 +0000 UTC m=+1848.309310098" lastFinishedPulling="2026-03-10 09:34:22.685303708 +0000 UTC m=+1848.940201597" observedRunningTime="2026-03-10 09:34:23.247361505 +0000 UTC m=+1849.502259393" watchObservedRunningTime="2026-03-10 09:34:23.256921721 +0000 UTC m=+1849.511819610" Mar 10 09:34:24 crc kubenswrapper[4883]: I0310 09:34:24.089741 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c8e962-9007-49e1-bd9f-d822e9100291" path="/var/lib/kubelet/pods/46c8e962-9007-49e1-bd9f-d822e9100291/volumes" Mar 10 09:34:28 crc kubenswrapper[4883]: I0310 09:34:28.274365 4883 generic.go:334] "Generic (PLEG): container finished" podID="caa69332-97ab-4629-900f-1596af363ba4" containerID="fb01332e612ee8643c9cd5071e07447d36df50055a15b4afd7c545e1c1d03333" exitCode=0 Mar 10 09:34:28 crc kubenswrapper[4883]: I0310 09:34:28.274462 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" event={"ID":"caa69332-97ab-4629-900f-1596af363ba4","Type":"ContainerDied","Data":"fb01332e612ee8643c9cd5071e07447d36df50055a15b4afd7c545e1c1d03333"} Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.617873 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.805453 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") pod \"caa69332-97ab-4629-900f-1596af363ba4\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.805760 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") pod \"caa69332-97ab-4629-900f-1596af363ba4\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.805855 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") pod \"caa69332-97ab-4629-900f-1596af363ba4\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.811775 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh" (OuterVolumeSpecName: "kube-api-access-nhnxh") pod "caa69332-97ab-4629-900f-1596af363ba4" (UID: "caa69332-97ab-4629-900f-1596af363ba4"). InnerVolumeSpecName "kube-api-access-nhnxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.829675 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "caa69332-97ab-4629-900f-1596af363ba4" (UID: "caa69332-97ab-4629-900f-1596af363ba4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.831866 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "caa69332-97ab-4629-900f-1596af363ba4" (UID: "caa69332-97ab-4629-900f-1596af363ba4"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.908957 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.908990 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.909001 4883 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.079869 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:34:30 crc 
kubenswrapper[4883]: E0310 09:34:30.080331 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.292949 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" event={"ID":"caa69332-97ab-4629-900f-1596af363ba4","Type":"ContainerDied","Data":"aa00dd959f63d750c79bf5b931b12bc09736bfd10221fd826891011af80ca166"} Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.293003 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa00dd959f63d750c79bf5b931b12bc09736bfd10221fd826891011af80ca166" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.293005 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.358659 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc"] Mar 10 09:34:30 crc kubenswrapper[4883]: E0310 09:34:30.359058 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa69332-97ab-4629-900f-1596af363ba4" containerName="ssh-known-hosts-edpm-deployment" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.359080 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa69332-97ab-4629-900f-1596af363ba4" containerName="ssh-known-hosts-edpm-deployment" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.359264 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa69332-97ab-4629-900f-1596af363ba4" containerName="ssh-known-hosts-edpm-deployment" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.359895 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.362319 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.362547 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.362736 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.363684 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.373981 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc"] Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.418257 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.418348 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.418570 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.519820 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.519892 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.519944 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.525161 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: 
\"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.525964 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.536866 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.673955 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:31 crc kubenswrapper[4883]: I0310 09:34:31.133546 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc"] Mar 10 09:34:31 crc kubenswrapper[4883]: I0310 09:34:31.301417 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" event={"ID":"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b","Type":"ContainerStarted","Data":"d5f0e6ed073a3d1de2b6b73203aecce347baf4d6a677d985db4e33c19ce15970"} Mar 10 09:34:32 crc kubenswrapper[4883]: I0310 09:34:32.310941 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" event={"ID":"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b","Type":"ContainerStarted","Data":"4c819d4de3b0b257ea9ca3015be282299cdfba8979a472bd1fbf9b9d67be7933"} Mar 10 09:34:32 crc kubenswrapper[4883]: I0310 09:34:32.341741 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" podStartSLOduration=1.846960975 podStartE2EDuration="2.341715335s" podCreationTimestamp="2026-03-10 09:34:30 +0000 UTC" firstStartedPulling="2026-03-10 09:34:31.14128878 +0000 UTC m=+1857.396186669" lastFinishedPulling="2026-03-10 09:34:31.63604314 +0000 UTC m=+1857.890941029" observedRunningTime="2026-03-10 09:34:32.323316562 +0000 UTC m=+1858.578214451" watchObservedRunningTime="2026-03-10 09:34:32.341715335 +0000 UTC m=+1858.596613224" Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.024621 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.030284 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.087443 4883 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" path="/var/lib/kubelet/pods/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05/volumes" Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.361208 4883 generic.go:334] "Generic (PLEG): container finished" podID="61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" containerID="4c819d4de3b0b257ea9ca3015be282299cdfba8979a472bd1fbf9b9d67be7933" exitCode=0 Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.361260 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" event={"ID":"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b","Type":"ContainerDied","Data":"4c819d4de3b0b257ea9ca3015be282299cdfba8979a472bd1fbf9b9d67be7933"} Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.700879 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.899260 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") pod \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.899391 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") pod \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.899508 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") pod 
\"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.905119 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4" (OuterVolumeSpecName: "kube-api-access-swnx4") pod "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" (UID: "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b"). InnerVolumeSpecName "kube-api-access-swnx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.921095 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" (UID: "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.921570 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory" (OuterVolumeSpecName: "inventory") pod "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" (UID: "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.002066 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.002098 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") on node \"crc\" DevicePath \"\""
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.002109 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.024043 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"]
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.029895 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"]
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.089388 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" path="/var/lib/kubelet/pods/3d3a7934-1ab2-4013-b3ff-90859ffcc179/volumes"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.377159 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" event={"ID":"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b","Type":"ContainerDied","Data":"d5f0e6ed073a3d1de2b6b73203aecce347baf4d6a677d985db4e33c19ce15970"}
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.377212 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5f0e6ed073a3d1de2b6b73203aecce347baf4d6a677d985db4e33c19ce15970"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.377268 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.432069 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"]
Mar 10 09:34:40 crc kubenswrapper[4883]: E0310 09:34:40.432671 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.432691 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.432932 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.433730 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.437993 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.438106 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"]
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.438943 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.439170 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.439222 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.612384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.612597 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.612661 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.714054 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.714131 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.714187 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.719164 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.719788 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.728950 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.752988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:41 crc kubenswrapper[4883]: I0310 09:34:41.080149 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"
Mar 10 09:34:41 crc kubenswrapper[4883]: E0310 09:34:41.080422 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:34:41 crc kubenswrapper[4883]: I0310 09:34:41.208613 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"]
Mar 10 09:34:41 crc kubenswrapper[4883]: I0310 09:34:41.387995 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" event={"ID":"0efdf39d-2133-4aaf-9fec-2b50533d3cae","Type":"ContainerStarted","Data":"16f2e4b6a9b14352515d1b370e2f4f84e250f6963b28fb04190abdc0181c01d6"}
Mar 10 09:34:42 crc kubenswrapper[4883]: I0310 09:34:42.396954 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" event={"ID":"0efdf39d-2133-4aaf-9fec-2b50533d3cae","Type":"ContainerStarted","Data":"73f1da7ddd7a009338a4315bdefbd88cf56f53350b85e5696a99a4463ab74afe"}
Mar 10 09:34:42 crc kubenswrapper[4883]: I0310 09:34:42.414108 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" podStartSLOduration=1.9080569760000001 podStartE2EDuration="2.414088131s" podCreationTimestamp="2026-03-10 09:34:40 +0000 UTC" firstStartedPulling="2026-03-10 09:34:41.213882132 +0000 UTC m=+1867.468780021" lastFinishedPulling="2026-03-10 09:34:41.719913287 +0000 UTC m=+1867.974811176" observedRunningTime="2026-03-10 09:34:42.412450655 +0000 UTC m=+1868.667348543" watchObservedRunningTime="2026-03-10 09:34:42.414088131 +0000 UTC m=+1868.668986021"
Mar 10 09:34:49 crc kubenswrapper[4883]: I0310 09:34:49.454319 4883 generic.go:334] "Generic (PLEG): container finished" podID="0efdf39d-2133-4aaf-9fec-2b50533d3cae" containerID="73f1da7ddd7a009338a4315bdefbd88cf56f53350b85e5696a99a4463ab74afe" exitCode=0
Mar 10 09:34:49 crc kubenswrapper[4883]: I0310 09:34:49.454401 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" event={"ID":"0efdf39d-2133-4aaf-9fec-2b50533d3cae","Type":"ContainerDied","Data":"73f1da7ddd7a009338a4315bdefbd88cf56f53350b85e5696a99a4463ab74afe"}
Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.761867 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.903676 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") pod \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") "
Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.903737 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") pod \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") "
Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.903764 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") pod \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") "
Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.910002 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7" (OuterVolumeSpecName: "kube-api-access-wqjd7") pod "0efdf39d-2133-4aaf-9fec-2b50533d3cae" (UID: "0efdf39d-2133-4aaf-9fec-2b50533d3cae"). InnerVolumeSpecName "kube-api-access-wqjd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.928740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory" (OuterVolumeSpecName: "inventory") pod "0efdf39d-2133-4aaf-9fec-2b50533d3cae" (UID: "0efdf39d-2133-4aaf-9fec-2b50533d3cae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.929253 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0efdf39d-2133-4aaf-9fec-2b50533d3cae" (UID: "0efdf39d-2133-4aaf-9fec-2b50533d3cae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.006156 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") on node \"crc\" DevicePath \"\""
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.006191 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.006204 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.470757 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" event={"ID":"0efdf39d-2133-4aaf-9fec-2b50533d3cae","Type":"ContainerDied","Data":"16f2e4b6a9b14352515d1b370e2f4f84e250f6963b28fb04190abdc0181c01d6"}
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.471250 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16f2e4b6a9b14352515d1b370e2f4f84e250f6963b28fb04190abdc0181c01d6"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.470950 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.536762 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"]
Mar 10 09:34:51 crc kubenswrapper[4883]: E0310 09:34:51.537154 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0efdf39d-2133-4aaf-9fec-2b50533d3cae" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.537173 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0efdf39d-2133-4aaf-9fec-2b50533d3cae" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.537353 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0efdf39d-2133-4aaf-9fec-2b50533d3cae" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.538032 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.541555 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.541631 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542047 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542107 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542179 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542183 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542240 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542239 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.548280 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"]
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720198 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720255 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720608 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6tzv\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720689 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720764 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720799 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720871 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720922 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720952 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.721011 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.721092 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.721203 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.821967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822004 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822040 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822070 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822088 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822116 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822154 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822200 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822247 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822271 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822295 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822317 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822333 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6tzv\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822363 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.826595 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.826691 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.827014 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.827642 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.828074 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.828896 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.828975 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.828996 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.829055 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.830081 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.830461 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.830712 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.832030 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.843048 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6tzv\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.851638 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"
Mar 10 09:34:52 crc kubenswrapper[4883]: I0310 09:34:52.297422 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"]
Mar 10 09:34:52 crc kubenswrapper[4883]: I0310 09:34:52.481335 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" event={"ID":"7e9f7531-37e1-4284-94ac-cada3d2fc301","Type":"ContainerStarted","Data":"127055f8bdc0041304754e84a763bfae34d394957e46e16446f33cdbc93502be"}
Mar 10 09:34:53 crc kubenswrapper[4883]: I0310 09:34:53.491430 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" event={"ID":"7e9f7531-37e1-4284-94ac-cada3d2fc301","Type":"ContainerStarted","Data":"2cd02ede66a5fe79061c1a6091f99f4680432d2ba4ea8cd1a7417070b12939f8"}
Mar 10 09:34:53 crc kubenswrapper[4883]: I0310 09:34:53.519607 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" podStartSLOduration=2.021470864 podStartE2EDuration="2.519580421s" podCreationTimestamp="2026-03-10 09:34:51 +0000 UTC" firstStartedPulling="2026-03-10 09:34:52.303328646 +0000 UTC m=+1878.558226535" lastFinishedPulling="2026-03-10 09:34:52.801438204 +0000 UTC m=+1879.056336092" observedRunningTime="2026-03-10 09:34:53.51062245 +0000 UTC m=+1879.765520340" watchObservedRunningTime="2026-03-10 09:34:53.519580421 +0000 UTC m=+1879.774478310"
Mar 10 09:34:55 crc kubenswrapper[4883]: I0310 09:34:55.080598 4883 scope.go:117] "RemoveContainer"
containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:34:55 crc kubenswrapper[4883]: I0310 09:34:55.512817 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d"} Mar 10 09:34:56 crc kubenswrapper[4883]: I0310 09:34:56.037115 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"] Mar 10 09:34:56 crc kubenswrapper[4883]: I0310 09:34:56.043453 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"] Mar 10 09:34:56 crc kubenswrapper[4883]: I0310 09:34:56.089200 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf096652-ae85-4c98-8821-cd47eafae98f" path="/var/lib/kubelet/pods/cf096652-ae85-4c98-8821-cd47eafae98f/volumes" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.404360 4883 scope.go:117] "RemoveContainer" containerID="bec7039a9730daef00f6f750a9ad04bbfacd1a82b7aa8fa9152df9b60b9ae6d2" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.444246 4883 scope.go:117] "RemoveContainer" containerID="354a3c28cf873c296da4850a04f59db6b94dbed1b8ba2c5447e9a4525bd40aed" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.479456 4883 scope.go:117] "RemoveContainer" containerID="9cbbbfbaa746758ec604b741e62dd7f69a1ec3688b2c70a13290cb0a5d5252e8" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.511662 4883 scope.go:117] "RemoveContainer" containerID="1b941b684a31273444bcbcfa6618c7c4de8354efb411f75cdb5b7cfd833281c3" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.557389 4883 scope.go:117] "RemoveContainer" containerID="996ee60e7ca18498ccd88f148bbd61209a4f5cc8500909482eda8135e24f983e" Mar 10 09:35:19 crc kubenswrapper[4883]: I0310 09:35:19.745485 4883 generic.go:334] "Generic (PLEG): 
container finished" podID="7e9f7531-37e1-4284-94ac-cada3d2fc301" containerID="2cd02ede66a5fe79061c1a6091f99f4680432d2ba4ea8cd1a7417070b12939f8" exitCode=0 Mar 10 09:35:19 crc kubenswrapper[4883]: I0310 09:35:19.745561 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" event={"ID":"7e9f7531-37e1-4284-94ac-cada3d2fc301","Type":"ContainerDied","Data":"2cd02ede66a5fe79061c1a6091f99f4680432d2ba4ea8cd1a7417070b12939f8"} Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.099367 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251202 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251336 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251368 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251439 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251470 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251512 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251537 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251564 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251588 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251614 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6tzv\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251682 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251706 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251735 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.259924 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.260043 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.260975 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.261381 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.261959 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262118 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262220 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). 
InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262246 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv" (OuterVolumeSpecName: "kube-api-access-t6tzv") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "kube-api-access-t6tzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262508 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262720 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262839 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.264792 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.281229 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory" (OuterVolumeSpecName: "inventory") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.283737 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353521 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353558 4883 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353572 4883 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353583 4883 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353596 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353606 4883 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353617 4883 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-t6tzv\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353627 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353636 4883 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353657 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353669 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353680 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353692 4883 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353702 4883 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.766389 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" event={"ID":"7e9f7531-37e1-4284-94ac-cada3d2fc301","Type":"ContainerDied","Data":"127055f8bdc0041304754e84a763bfae34d394957e46e16446f33cdbc93502be"} Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.766802 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="127055f8bdc0041304754e84a763bfae34d394957e46e16446f33cdbc93502be" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.766506 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.910766 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"] Mar 10 09:35:21 crc kubenswrapper[4883]: E0310 09:35:21.911203 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9f7531-37e1-4284-94ac-cada3d2fc301" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.911223 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9f7531-37e1-4284-94ac-cada3d2fc301" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.911381 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9f7531-37e1-4284-94ac-cada3d2fc301" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.912102 4883 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.913540 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.913678 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.913852 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.914032 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.915571 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.919505 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"] Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065227 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065334 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: 
\"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065376 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065432 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065487 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166719 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166792 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166845 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166890 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166950 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.168720 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: 
\"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.172533 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.173737 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.174930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.184709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.231937 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.693417 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"] Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.698279 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.774983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" event={"ID":"bbcde384-73a5-48c3-a5fb-226d671707cb","Type":"ContainerStarted","Data":"27fed5b683575c0a08b954a9ae7d46ae8f20e461778e70a56e2493bbed622009"} Mar 10 09:35:23 crc kubenswrapper[4883]: I0310 09:35:23.789190 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" event={"ID":"bbcde384-73a5-48c3-a5fb-226d671707cb","Type":"ContainerStarted","Data":"1d006b59011e7940ba2ae5a4022657be65cc4bdf6115e6c741e5f6a8f0eb1eeb"} Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.129556 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" podStartSLOduration=38.593123913 podStartE2EDuration="39.129532487s" podCreationTimestamp="2026-03-10 09:35:21 +0000 UTC" firstStartedPulling="2026-03-10 09:35:22.698043881 +0000 UTC m=+1908.952941771" lastFinishedPulling="2026-03-10 09:35:23.234452456 +0000 UTC m=+1909.489350345" observedRunningTime="2026-03-10 09:35:23.815576056 +0000 UTC m=+1910.070473935" watchObservedRunningTime="2026-03-10 09:36:00.129532487 +0000 UTC m=+1946.384430375" Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.138342 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"] Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.139836 4883 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.142025 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.143284 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.143426 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.157820 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"] Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.335001 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") pod \"auto-csr-approver-29552256-fmqzj\" (UID: \"c70e8b0b-51ad-4080-8955-8aa8ee68f274\") " pod="openshift-infra/auto-csr-approver-29552256-fmqzj" Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.438262 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") pod \"auto-csr-approver-29552256-fmqzj\" (UID: \"c70e8b0b-51ad-4080-8955-8aa8ee68f274\") " pod="openshift-infra/auto-csr-approver-29552256-fmqzj" Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.458440 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") pod \"auto-csr-approver-29552256-fmqzj\" (UID: 
\"c70e8b0b-51ad-4080-8955-8aa8ee68f274\") " pod="openshift-infra/auto-csr-approver-29552256-fmqzj" Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.460134 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.858441 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"] Mar 10 09:36:01 crc kubenswrapper[4883]: I0310 09:36:01.120383 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" event={"ID":"c70e8b0b-51ad-4080-8955-8aa8ee68f274","Type":"ContainerStarted","Data":"7f943ddc9f27bb461c68b30cb9b1933997a1a124f599f94f92475364a5e85cb2"} Mar 10 09:36:02 crc kubenswrapper[4883]: I0310 09:36:02.131533 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" event={"ID":"c70e8b0b-51ad-4080-8955-8aa8ee68f274","Type":"ContainerStarted","Data":"f3c4bd29935ee27cc39cdc84c3f636e36dc2e8b9c1bfc85b30473304c9e1a2ff"} Mar 10 09:36:02 crc kubenswrapper[4883]: I0310 09:36:02.157052 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" podStartSLOduration=1.186254833 podStartE2EDuration="2.157035736s" podCreationTimestamp="2026-03-10 09:36:00 +0000 UTC" firstStartedPulling="2026-03-10 09:36:00.861416858 +0000 UTC m=+1947.116314748" lastFinishedPulling="2026-03-10 09:36:01.832197762 +0000 UTC m=+1948.087095651" observedRunningTime="2026-03-10 09:36:02.146124339 +0000 UTC m=+1948.401022229" watchObservedRunningTime="2026-03-10 09:36:02.157035736 +0000 UTC m=+1948.411933625" Mar 10 09:36:03 crc kubenswrapper[4883]: I0310 09:36:03.145672 4883 generic.go:334] "Generic (PLEG): container finished" podID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" containerID="f3c4bd29935ee27cc39cdc84c3f636e36dc2e8b9c1bfc85b30473304c9e1a2ff" 
exitCode=0 Mar 10 09:36:03 crc kubenswrapper[4883]: I0310 09:36:03.145771 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" event={"ID":"c70e8b0b-51ad-4080-8955-8aa8ee68f274","Type":"ContainerDied","Data":"f3c4bd29935ee27cc39cdc84c3f636e36dc2e8b9c1bfc85b30473304c9e1a2ff"} Mar 10 09:36:04 crc kubenswrapper[4883]: I0310 09:36:04.449714 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" Mar 10 09:36:04 crc kubenswrapper[4883]: I0310 09:36:04.631490 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") pod \"c70e8b0b-51ad-4080-8955-8aa8ee68f274\" (UID: \"c70e8b0b-51ad-4080-8955-8aa8ee68f274\") " Mar 10 09:36:04 crc kubenswrapper[4883]: I0310 09:36:04.637375 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj" (OuterVolumeSpecName: "kube-api-access-6z5bj") pod "c70e8b0b-51ad-4080-8955-8aa8ee68f274" (UID: "c70e8b0b-51ad-4080-8955-8aa8ee68f274"). InnerVolumeSpecName "kube-api-access-6z5bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:36:04 crc kubenswrapper[4883]: I0310 09:36:04.734074 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.170825 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" event={"ID":"c70e8b0b-51ad-4080-8955-8aa8ee68f274","Type":"ContainerDied","Data":"7f943ddc9f27bb461c68b30cb9b1933997a1a124f599f94f92475364a5e85cb2"} Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.170885 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f943ddc9f27bb461c68b30cb9b1933997a1a124f599f94f92475364a5e85cb2" Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.170891 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.217410 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"] Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.223425 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"] Mar 10 09:36:06 crc kubenswrapper[4883]: I0310 09:36:06.097998 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea7fca8-0ec0-44f9-b729-2c150761519f" path="/var/lib/kubelet/pods/aea7fca8-0ec0-44f9-b729-2c150761519f/volumes" Mar 10 09:36:10 crc kubenswrapper[4883]: I0310 09:36:10.216076 4883 generic.go:334] "Generic (PLEG): container finished" podID="bbcde384-73a5-48c3-a5fb-226d671707cb" containerID="1d006b59011e7940ba2ae5a4022657be65cc4bdf6115e6c741e5f6a8f0eb1eeb" exitCode=0 Mar 10 09:36:10 crc kubenswrapper[4883]: I0310 09:36:10.216191 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" event={"ID":"bbcde384-73a5-48c3-a5fb-226d671707cb","Type":"ContainerDied","Data":"1d006b59011e7940ba2ae5a4022657be65cc4bdf6115e6c741e5f6a8f0eb1eeb"} Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.587242 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.670560 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.670629 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.670810 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.670945 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.671405 4883 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.676090 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.676267 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp" (OuterVolumeSpecName: "kube-api-access-tmfbp") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "kube-api-access-tmfbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:36:11 crc kubenswrapper[4883]: E0310 09:36:11.691557 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam podName:bbcde384-73a5-48c3-a5fb-226d671707cb nodeName:}" failed. No retries permitted until 2026-03-10 09:36:12.191518475 +0000 UTC m=+1958.446416364 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb") : error deleting /var/lib/kubelet/pods/bbcde384-73a5-48c3-a5fb-226d671707cb/volume-subpaths: remove /var/lib/kubelet/pods/bbcde384-73a5-48c3-a5fb-226d671707cb/volume-subpaths: no such file or directory Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.691942 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.692941 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory" (OuterVolumeSpecName: "inventory") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.773556 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.773584 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.773596 4883 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.773605 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.233951 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" event={"ID":"bbcde384-73a5-48c3-a5fb-226d671707cb","Type":"ContainerDied","Data":"27fed5b683575c0a08b954a9ae7d46ae8f20e461778e70a56e2493bbed622009"} Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.234007 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27fed5b683575c0a08b954a9ae7d46ae8f20e461778e70a56e2493bbed622009" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.234013 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.280236 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.284289 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.302550 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"] Mar 10 09:36:12 crc kubenswrapper[4883]: E0310 09:36:12.302943 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" containerName="oc" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.302961 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" containerName="oc" Mar 10 09:36:12 crc kubenswrapper[4883]: E0310 09:36:12.302975 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbcde384-73a5-48c3-a5fb-226d671707cb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.302982 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcde384-73a5-48c3-a5fb-226d671707cb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 
09:36:12.303168 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" containerName="oc" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.303189 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbcde384-73a5-48c3-a5fb-226d671707cb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.303882 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.305927 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.306341 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.313187 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"] Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.383867 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487141 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487539 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487595 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487631 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487702 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487744 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589060 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589179 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589218 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589262 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589290 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589933 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.594863 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.594987 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.596167 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.596709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.599256 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.605862 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.632325 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:13 crc kubenswrapper[4883]: I0310 09:36:13.135504 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"] Mar 10 09:36:13 crc kubenswrapper[4883]: I0310 09:36:13.245836 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" event={"ID":"d37d0afe-ad64-4616-b877-bd05deefd038","Type":"ContainerStarted","Data":"1447bbc4710c3544c6436b09315bed36a46021e5b37601faa4c7e6f80c6d6f28"} Mar 10 09:36:14 crc kubenswrapper[4883]: I0310 09:36:14.259936 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" event={"ID":"d37d0afe-ad64-4616-b877-bd05deefd038","Type":"ContainerStarted","Data":"5c34c6a621b1787c79c91995c5363b35386fa4a4b8f3dd41526947887890640b"} Mar 10 09:36:14 crc kubenswrapper[4883]: I0310 09:36:14.285468 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" podStartSLOduration=1.7759773540000001 podStartE2EDuration="2.285446886s" podCreationTimestamp="2026-03-10 09:36:12 +0000 UTC" firstStartedPulling="2026-03-10 09:36:13.140961772 +0000 UTC m=+1959.395859651" lastFinishedPulling="2026-03-10 09:36:13.650431293 +0000 UTC m=+1959.905329183" observedRunningTime="2026-03-10 09:36:14.281027982 +0000 UTC m=+1960.535925881" watchObservedRunningTime="2026-03-10 09:36:14.285446886 +0000 UTC m=+1960.540344765" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 
09:36:22.480491 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"] Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.484156 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.488568 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.488639 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.488776 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.502920 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"] Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.594570 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") 
pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.594622 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.594689 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.595257 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.595269 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.613645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") pod \"redhat-marketplace-8j4kp\" (UID: 
\"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.804207 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:23 crc kubenswrapper[4883]: I0310 09:36:23.206626 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"] Mar 10 09:36:23 crc kubenswrapper[4883]: I0310 09:36:23.337635 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerStarted","Data":"390eed8ba1d2bc57af65e38ba2174fcb259566f3e424f5c3de06515e76c15665"} Mar 10 09:36:24 crc kubenswrapper[4883]: I0310 09:36:24.349711 4883 generic.go:334] "Generic (PLEG): container finished" podID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerID="63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b" exitCode=0 Mar 10 09:36:24 crc kubenswrapper[4883]: I0310 09:36:24.349911 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerDied","Data":"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b"} Mar 10 09:36:25 crc kubenswrapper[4883]: I0310 09:36:25.360833 4883 generic.go:334] "Generic (PLEG): container finished" podID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerID="78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8" exitCode=0 Mar 10 09:36:25 crc kubenswrapper[4883]: I0310 09:36:25.360925 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerDied","Data":"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8"} Mar 10 09:36:26 crc kubenswrapper[4883]: I0310 09:36:26.373431 
4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerStarted","Data":"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2"} Mar 10 09:36:26 crc kubenswrapper[4883]: I0310 09:36:26.393631 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8j4kp" podStartSLOduration=2.91504723 podStartE2EDuration="4.393607076s" podCreationTimestamp="2026-03-10 09:36:22 +0000 UTC" firstStartedPulling="2026-03-10 09:36:24.353071568 +0000 UTC m=+1970.607969458" lastFinishedPulling="2026-03-10 09:36:25.831631416 +0000 UTC m=+1972.086529304" observedRunningTime="2026-03-10 09:36:26.388971944 +0000 UTC m=+1972.643869834" watchObservedRunningTime="2026-03-10 09:36:26.393607076 +0000 UTC m=+1972.648504965" Mar 10 09:36:32 crc kubenswrapper[4883]: I0310 09:36:32.804857 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:32 crc kubenswrapper[4883]: I0310 09:36:32.805169 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:32 crc kubenswrapper[4883]: I0310 09:36:32.841331 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:33 crc kubenswrapper[4883]: I0310 09:36:33.469586 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:33 crc kubenswrapper[4883]: I0310 09:36:33.519232 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"] Mar 10 09:36:35 crc kubenswrapper[4883]: I0310 09:36:35.452887 4883 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-8j4kp" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="registry-server" containerID="cri-o://6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" gracePeriod=2 Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.310379 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.372870 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") pod \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.372974 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") pod \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.373089 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") pod \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.373884 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities" (OuterVolumeSpecName: "utilities") pod "a0d40308-0487-45a0-9ebe-8978ccc4b10b" (UID: "a0d40308-0487-45a0-9ebe-8978ccc4b10b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.378692 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z" (OuterVolumeSpecName: "kube-api-access-d2j7z") pod "a0d40308-0487-45a0-9ebe-8978ccc4b10b" (UID: "a0d40308-0487-45a0-9ebe-8978ccc4b10b"). InnerVolumeSpecName "kube-api-access-d2j7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.393788 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0d40308-0487-45a0-9ebe-8978ccc4b10b" (UID: "a0d40308-0487-45a0-9ebe-8978ccc4b10b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466136 4883 generic.go:334] "Generic (PLEG): container finished" podID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerID="6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" exitCode=0 Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466209 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerDied","Data":"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2"} Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466228 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466258 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerDied","Data":"390eed8ba1d2bc57af65e38ba2174fcb259566f3e424f5c3de06515e76c15665"} Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466284 4883 scope.go:117] "RemoveContainer" containerID="6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.476747 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.476774 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.476810 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.492566 4883 scope.go:117] "RemoveContainer" containerID="78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.495334 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"] Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.501461 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"] Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.515719 4883 scope.go:117] 
"RemoveContainer" containerID="63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.546617 4883 scope.go:117] "RemoveContainer" containerID="6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" Mar 10 09:36:36 crc kubenswrapper[4883]: E0310 09:36:36.547072 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2\": container with ID starting with 6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2 not found: ID does not exist" containerID="6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.547107 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2"} err="failed to get container status \"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2\": rpc error: code = NotFound desc = could not find container \"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2\": container with ID starting with 6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2 not found: ID does not exist" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.547132 4883 scope.go:117] "RemoveContainer" containerID="78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8" Mar 10 09:36:36 crc kubenswrapper[4883]: E0310 09:36:36.547427 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8\": container with ID starting with 78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8 not found: ID does not exist" containerID="78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8" Mar 10 09:36:36 crc 
kubenswrapper[4883]: I0310 09:36:36.547468 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8"} err="failed to get container status \"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8\": rpc error: code = NotFound desc = could not find container \"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8\": container with ID starting with 78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8 not found: ID does not exist" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.547517 4883 scope.go:117] "RemoveContainer" containerID="63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b" Mar 10 09:36:36 crc kubenswrapper[4883]: E0310 09:36:36.547860 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b\": container with ID starting with 63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b not found: ID does not exist" containerID="63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.547884 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b"} err="failed to get container status \"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b\": rpc error: code = NotFound desc = could not find container \"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b\": container with ID starting with 63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b not found: ID does not exist" Mar 10 09:36:38 crc kubenswrapper[4883]: I0310 09:36:38.088898 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" 
path="/var/lib/kubelet/pods/a0d40308-0487-45a0-9ebe-8978ccc4b10b/volumes" Mar 10 09:36:49 crc kubenswrapper[4883]: I0310 09:36:49.591519 4883 generic.go:334] "Generic (PLEG): container finished" podID="d37d0afe-ad64-4616-b877-bd05deefd038" containerID="5c34c6a621b1787c79c91995c5363b35386fa4a4b8f3dd41526947887890640b" exitCode=0 Mar 10 09:36:49 crc kubenswrapper[4883]: I0310 09:36:49.591589 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" event={"ID":"d37d0afe-ad64-4616-b877-bd05deefd038","Type":"ContainerDied","Data":"5c34c6a621b1787c79c91995c5363b35386fa4a4b8f3dd41526947887890640b"} Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.948292 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953637 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953702 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953787 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953861 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953912 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953990 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.960655 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.960660 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch" (OuterVolumeSpecName: "kube-api-access-stcch") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "kube-api-access-stcch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.980192 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory" (OuterVolumeSpecName: "inventory") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.984469 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.986420 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.986964 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.055993 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056056 4883 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056071 4883 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056084 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056098 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 
09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056108 4883 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.609030 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" event={"ID":"d37d0afe-ad64-4616-b877-bd05deefd038","Type":"ContainerDied","Data":"1447bbc4710c3544c6436b09315bed36a46021e5b37601faa4c7e6f80c6d6f28"} Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.609088 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1447bbc4710c3544c6436b09315bed36a46021e5b37601faa4c7e6f80c6d6f28" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.609407 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.709719 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw"] Mar 10 09:36:51 crc kubenswrapper[4883]: E0310 09:36:51.710420 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="extract-utilities" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.710546 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="extract-utilities" Mar 10 09:36:51 crc kubenswrapper[4883]: E0310 09:36:51.710640 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37d0afe-ad64-4616-b877-bd05deefd038" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.710706 4883 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d37d0afe-ad64-4616-b877-bd05deefd038" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 09:36:51 crc kubenswrapper[4883]: E0310 09:36:51.710774 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="registry-server" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.710833 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="registry-server" Mar 10 09:36:51 crc kubenswrapper[4883]: E0310 09:36:51.710911 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="extract-content" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.710973 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="extract-content" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.711221 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37d0afe-ad64-4616-b877-bd05deefd038" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.711285 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="registry-server" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.712175 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.714550 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.715341 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.715416 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.715340 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.715600 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.717393 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw"] Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768201 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768400 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768561 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768665 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768815 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.870314 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.870634 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.870782 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.870964 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.871070 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.875277 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: 
\"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.875502 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.875961 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.876340 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.885208 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:52 crc kubenswrapper[4883]: I0310 09:36:52.029785 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:52 crc kubenswrapper[4883]: I0310 09:36:52.473312 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw"] Mar 10 09:36:52 crc kubenswrapper[4883]: I0310 09:36:52.618176 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" event={"ID":"eb3b72a2-945a-4719-87c0-ffaf7eb84b52","Type":"ContainerStarted","Data":"5132af8adf7ec0c2f156b654b7c9255b712406e2697f68910d1bb75a882c3793"} Mar 10 09:36:53 crc kubenswrapper[4883]: I0310 09:36:53.626518 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" event={"ID":"eb3b72a2-945a-4719-87c0-ffaf7eb84b52","Type":"ContainerStarted","Data":"ee183e9dfde8ff8beaedf0c3401eda98426713aedff5f916f72c140e022aa4c6"} Mar 10 09:36:53 crc kubenswrapper[4883]: I0310 09:36:53.652922 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" podStartSLOduration=2.155237998 podStartE2EDuration="2.652901696s" podCreationTimestamp="2026-03-10 09:36:51 +0000 UTC" firstStartedPulling="2026-03-10 09:36:52.481106129 +0000 UTC m=+1998.736004018" lastFinishedPulling="2026-03-10 09:36:52.978769827 +0000 UTC m=+1999.233667716" observedRunningTime="2026-03-10 09:36:53.640018139 +0000 UTC m=+1999.894916028" watchObservedRunningTime="2026-03-10 09:36:53.652901696 +0000 UTC m=+1999.907799585" Mar 10 09:37:05 crc kubenswrapper[4883]: I0310 09:37:05.673019 4883 scope.go:117] "RemoveContainer" containerID="55b96e78b0c842f2c57bd194eb8366216f87ea2f29ff4952bc2e23513be83089" Mar 10 09:37:17 crc kubenswrapper[4883]: I0310 09:37:17.448954 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:37:17 crc kubenswrapper[4883]: I0310 09:37:17.449542 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:37:47 crc kubenswrapper[4883]: I0310 09:37:47.449197 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:37:47 crc kubenswrapper[4883]: I0310 09:37:47.449741 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.137094 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.139124 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.141465 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.141500 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.143706 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.143801 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.288157 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") pod \"auto-csr-approver-29552258-6pf25\" (UID: \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\") " pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.390007 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") pod \"auto-csr-approver-29552258-6pf25\" (UID: \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\") " pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.408306 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") pod \"auto-csr-approver-29552258-6pf25\" (UID: \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\") " 
pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.454409 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.863771 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:38:00 crc kubenswrapper[4883]: W0310 09:38:00.873250 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ff28fc4_3134_48ba_9697_b74e9b4e6ec5.slice/crio-ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5 WatchSource:0}: Error finding container ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5: Status 404 returned error can't find the container with id ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5 Mar 10 09:38:01 crc kubenswrapper[4883]: I0310 09:38:01.221444 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552258-6pf25" event={"ID":"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5","Type":"ContainerStarted","Data":"ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5"} Mar 10 09:38:03 crc kubenswrapper[4883]: I0310 09:38:03.256886 4883 generic.go:334] "Generic (PLEG): container finished" podID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" containerID="0172a22ca1b2674eaf7b22f058338bb6d8a1a070eb6300e099b6179e3eec55d7" exitCode=0 Mar 10 09:38:03 crc kubenswrapper[4883]: I0310 09:38:03.257347 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552258-6pf25" event={"ID":"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5","Type":"ContainerDied","Data":"0172a22ca1b2674eaf7b22f058338bb6d8a1a070eb6300e099b6179e3eec55d7"} Mar 10 09:38:04 crc kubenswrapper[4883]: I0310 09:38:04.555116 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:04 crc kubenswrapper[4883]: I0310 09:38:04.684023 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") pod \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\" (UID: \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\") " Mar 10 09:38:04 crc kubenswrapper[4883]: I0310 09:38:04.689988 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc" (OuterVolumeSpecName: "kube-api-access-cl5zc") pod "2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" (UID: "2ff28fc4-3134-48ba-9697-b74e9b4e6ec5"). InnerVolumeSpecName "kube-api-access-cl5zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:38:04 crc kubenswrapper[4883]: I0310 09:38:04.787461 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") on node \"crc\" DevicePath \"\"" Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.289146 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552258-6pf25" event={"ID":"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5","Type":"ContainerDied","Data":"ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5"} Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.289207 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5" Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.289271 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.613625 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.620594 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:38:06 crc kubenswrapper[4883]: I0310 09:38:06.091638 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" path="/var/lib/kubelet/pods/fc74aa89-09d6-4974-a6c1-1642f6ef0a64/volumes" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.043277 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:17 crc kubenswrapper[4883]: E0310 09:38:17.044875 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" containerName="oc" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.044896 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" containerName="oc" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.045230 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" containerName="oc" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.047595 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.063228 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.130427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.130504 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.130589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.232312 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.232639 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.232803 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.232834 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.233046 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.256044 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.372887 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.449216 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.449275 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.449331 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.450075 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.450148 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d" gracePeriod=600 Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.871607 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:17 crc kubenswrapper[4883]: W0310 09:38:17.872231 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd770ddb2_bbf8_4a33_9a98_dddf894c2c86.slice/crio-2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470 WatchSource:0}: Error finding container 2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470: Status 404 returned error can't find the container with id 2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470 Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.412172 4883 generic.go:334] "Generic (PLEG): container finished" podID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerID="95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98" exitCode=0 Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.412456 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerDied","Data":"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98"} Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.412867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerStarted","Data":"2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470"} Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.416579 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d" exitCode=0 Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.416608 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d"} Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.416624 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b"} Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.416642 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:38:19 crc kubenswrapper[4883]: I0310 09:38:19.426327 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerStarted","Data":"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73"} Mar 10 09:38:20 crc kubenswrapper[4883]: I0310 09:38:20.443622 4883 generic.go:334] "Generic (PLEG): container finished" podID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerID="00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73" exitCode=0 Mar 10 09:38:20 crc kubenswrapper[4883]: I0310 09:38:20.443731 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerDied","Data":"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73"} Mar 10 09:38:21 crc kubenswrapper[4883]: I0310 09:38:21.457393 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerStarted","Data":"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994"} Mar 10 09:38:21 crc kubenswrapper[4883]: I0310 09:38:21.478728 4883 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/community-operators-f5q5h" podStartSLOduration=1.97855361 podStartE2EDuration="4.478697065s" podCreationTimestamp="2026-03-10 09:38:17 +0000 UTC" firstStartedPulling="2026-03-10 09:38:18.41412307 +0000 UTC m=+2084.669020959" lastFinishedPulling="2026-03-10 09:38:20.914266525 +0000 UTC m=+2087.169164414" observedRunningTime="2026-03-10 09:38:21.473236698 +0000 UTC m=+2087.728134587" watchObservedRunningTime="2026-03-10 09:38:21.478697065 +0000 UTC m=+2087.733594954" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.373261 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.373700 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.417778 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.555605 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.650988 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:29 crc kubenswrapper[4883]: I0310 09:38:29.529188 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f5q5h" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="registry-server" containerID="cri-o://06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" gracePeriod=2 Mar 10 09:38:29 crc kubenswrapper[4883]: I0310 09:38:29.924566 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.099563 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") pod \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.099688 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") pod \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.099853 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") pod \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.100577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities" (OuterVolumeSpecName: "utilities") pod "d770ddb2-bbf8-4a33-9a98-dddf894c2c86" (UID: "d770ddb2-bbf8-4a33-9a98-dddf894c2c86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.107061 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w" (OuterVolumeSpecName: "kube-api-access-kkn7w") pod "d770ddb2-bbf8-4a33-9a98-dddf894c2c86" (UID: "d770ddb2-bbf8-4a33-9a98-dddf894c2c86"). InnerVolumeSpecName "kube-api-access-kkn7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.145856 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d770ddb2-bbf8-4a33-9a98-dddf894c2c86" (UID: "d770ddb2-bbf8-4a33-9a98-dddf894c2c86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.204013 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") on node \"crc\" DevicePath \"\"" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.204059 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.204070 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.542848 4883 generic.go:334] "Generic (PLEG): container finished" podID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerID="06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" exitCode=0 Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.542945 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.542961 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerDied","Data":"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994"} Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.543342 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerDied","Data":"2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470"} Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.543369 4883 scope.go:117] "RemoveContainer" containerID="06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.571838 4883 scope.go:117] "RemoveContainer" containerID="00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.578365 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.598633 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.598886 4883 scope.go:117] "RemoveContainer" containerID="95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.632322 4883 scope.go:117] "RemoveContainer" containerID="06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" Mar 10 09:38:30 crc kubenswrapper[4883]: E0310 09:38:30.632811 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994\": container with ID starting with 06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994 not found: ID does not exist" containerID="06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.632856 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994"} err="failed to get container status \"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994\": rpc error: code = NotFound desc = could not find container \"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994\": container with ID starting with 06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994 not found: ID does not exist" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.632888 4883 scope.go:117] "RemoveContainer" containerID="00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73" Mar 10 09:38:30 crc kubenswrapper[4883]: E0310 09:38:30.633532 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73\": container with ID starting with 00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73 not found: ID does not exist" containerID="00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.633561 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73"} err="failed to get container status \"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73\": rpc error: code = NotFound desc = could not find container \"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73\": container with ID 
starting with 00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73 not found: ID does not exist" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.633576 4883 scope.go:117] "RemoveContainer" containerID="95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98" Mar 10 09:38:30 crc kubenswrapper[4883]: E0310 09:38:30.633837 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98\": container with ID starting with 95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98 not found: ID does not exist" containerID="95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.633873 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98"} err="failed to get container status \"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98\": rpc error: code = NotFound desc = could not find container \"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98\": container with ID starting with 95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98 not found: ID does not exist" Mar 10 09:38:32 crc kubenswrapper[4883]: I0310 09:38:32.089978 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" path="/var/lib/kubelet/pods/d770ddb2-bbf8-4a33-9a98-dddf894c2c86/volumes" Mar 10 09:39:05 crc kubenswrapper[4883]: I0310 09:39:05.786228 4883 scope.go:117] "RemoveContainer" containerID="76a36df1ff76227c193949f769a79a8229f0c35af6ce9046d5c6bb133c432611" Mar 10 09:39:49 crc kubenswrapper[4883]: I0310 09:39:49.259942 4883 generic.go:334] "Generic (PLEG): container finished" podID="eb3b72a2-945a-4719-87c0-ffaf7eb84b52" 
containerID="ee183e9dfde8ff8beaedf0c3401eda98426713aedff5f916f72c140e022aa4c6" exitCode=0 Mar 10 09:39:49 crc kubenswrapper[4883]: I0310 09:39:49.260042 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" event={"ID":"eb3b72a2-945a-4719-87c0-ffaf7eb84b52","Type":"ContainerDied","Data":"ee183e9dfde8ff8beaedf0c3401eda98426713aedff5f916f72c140e022aa4c6"} Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.577365 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681017 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") pod \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681087 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") pod \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681125 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") pod \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681165 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") pod 
\"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") pod \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.687260 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn" (OuterVolumeSpecName: "kube-api-access-m6mcn") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "kube-api-access-m6mcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.687811 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.705530 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.705542 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory" (OuterVolumeSpecName: "inventory") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.705889 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783156 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783186 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783197 4883 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783205 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783213 4883 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.275126 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" event={"ID":"eb3b72a2-945a-4719-87c0-ffaf7eb84b52","Type":"ContainerDied","Data":"5132af8adf7ec0c2f156b654b7c9255b712406e2697f68910d1bb75a882c3793"} Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.275173 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5132af8adf7ec0c2f156b654b7c9255b712406e2697f68910d1bb75a882c3793" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.275183 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.351525 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf"] Mar 10 09:39:51 crc kubenswrapper[4883]: E0310 09:39:51.352134 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="registry-server" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352152 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="registry-server" Mar 10 09:39:51 crc kubenswrapper[4883]: E0310 09:39:51.352178 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="extract-content" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352184 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="extract-content" Mar 10 09:39:51 crc kubenswrapper[4883]: E0310 09:39:51.352205 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3b72a2-945a-4719-87c0-ffaf7eb84b52" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352212 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3b72a2-945a-4719-87c0-ffaf7eb84b52" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 09:39:51 crc kubenswrapper[4883]: E0310 09:39:51.352227 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="extract-utilities" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352233 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="extract-utilities" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352400 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="registry-server" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352415 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb3b72a2-945a-4719-87c0-ffaf7eb84b52" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.353025 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355342 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355531 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355709 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355831 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355966 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.356084 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.356211 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.362288 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf"] Mar 10 09:39:51 crc 
kubenswrapper[4883]: I0310 09:39:51.396862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.396908 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.396935 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397052 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397126 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397199 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397268 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397307 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397374 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397404 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.499345 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.499425 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc 
kubenswrapper[4883]: I0310 09:39:51.499512 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.499538 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.499782 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.500425 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.500508 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.500781 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.501232 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.501288 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.501315 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.501376 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.506486 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.506660 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.506892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.507135 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.507380 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.507946 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.514275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.515255 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 
09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.516900 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.519326 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.665396 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:52 crc kubenswrapper[4883]: I0310 09:39:52.134832 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf"] Mar 10 09:39:52 crc kubenswrapper[4883]: I0310 09:39:52.283441 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" event={"ID":"af134b73-8c24-4b9e-b15e-48ff4b83ecd4","Type":"ContainerStarted","Data":"3f29108c5dad1a5578dac32057489e322fee1b450f1de55af05d05420ce128d5"} Mar 10 09:39:53 crc kubenswrapper[4883]: I0310 09:39:53.294570 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" event={"ID":"af134b73-8c24-4b9e-b15e-48ff4b83ecd4","Type":"ContainerStarted","Data":"4951ccf796499d6b47c904d8481fa70523d42eff658bee0de028c1741f6614b5"} Mar 10 09:39:53 crc kubenswrapper[4883]: I0310 09:39:53.317549 4883 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" podStartSLOduration=1.796281427 podStartE2EDuration="2.317527158s" podCreationTimestamp="2026-03-10 09:39:51 +0000 UTC" firstStartedPulling="2026-03-10 09:39:52.138672604 +0000 UTC m=+2178.393570493" lastFinishedPulling="2026-03-10 09:39:52.659918335 +0000 UTC m=+2178.914816224" observedRunningTime="2026-03-10 09:39:53.311018793 +0000 UTC m=+2179.565916682" watchObservedRunningTime="2026-03-10 09:39:53.317527158 +0000 UTC m=+2179.572425047" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.142086 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.143778 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.145297 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.145780 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.146117 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.156490 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.216994 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") pod \"auto-csr-approver-29552260-m5d6m\" (UID: 
\"9c22b88a-ce63-4c4a-a606-17c563e9e156\") " pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.318006 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") pod \"auto-csr-approver-29552260-m5d6m\" (UID: \"9c22b88a-ce63-4c4a-a606-17c563e9e156\") " pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.335535 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") pod \"auto-csr-approver-29552260-m5d6m\" (UID: \"9c22b88a-ce63-4c4a-a606-17c563e9e156\") " pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.461876 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.892224 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:40:01 crc kubenswrapper[4883]: I0310 09:40:01.363875 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" event={"ID":"9c22b88a-ce63-4c4a-a606-17c563e9e156","Type":"ContainerStarted","Data":"9b1442d84ebc314d87a87c497d41a78e35b6dda32220bfadc36e5e2d1a796ff3"} Mar 10 09:40:03 crc kubenswrapper[4883]: I0310 09:40:03.383416 4883 generic.go:334] "Generic (PLEG): container finished" podID="9c22b88a-ce63-4c4a-a606-17c563e9e156" containerID="43ec4b59a40b1bfb044dd60703dd00f3cac266b590c5664aa4f0403a553e3872" exitCode=0 Mar 10 09:40:03 crc kubenswrapper[4883]: I0310 09:40:03.383517 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" event={"ID":"9c22b88a-ce63-4c4a-a606-17c563e9e156","Type":"ContainerDied","Data":"43ec4b59a40b1bfb044dd60703dd00f3cac266b590c5664aa4f0403a553e3872"} Mar 10 09:40:04 crc kubenswrapper[4883]: I0310 09:40:04.723046 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:04 crc kubenswrapper[4883]: I0310 09:40:04.912622 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") pod \"9c22b88a-ce63-4c4a-a606-17c563e9e156\" (UID: \"9c22b88a-ce63-4c4a-a606-17c563e9e156\") " Mar 10 09:40:04 crc kubenswrapper[4883]: I0310 09:40:04.919939 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr" (OuterVolumeSpecName: "kube-api-access-n8dlr") pod "9c22b88a-ce63-4c4a-a606-17c563e9e156" (UID: "9c22b88a-ce63-4c4a-a606-17c563e9e156"). InnerVolumeSpecName "kube-api-access-n8dlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.014387 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") on node \"crc\" DevicePath \"\"" Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.402938 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" event={"ID":"9c22b88a-ce63-4c4a-a606-17c563e9e156","Type":"ContainerDied","Data":"9b1442d84ebc314d87a87c497d41a78e35b6dda32220bfadc36e5e2d1a796ff3"} Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.402988 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b1442d84ebc314d87a87c497d41a78e35b6dda32220bfadc36e5e2d1a796ff3" Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.403000 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.779130 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.784110 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:40:06 crc kubenswrapper[4883]: I0310 09:40:06.090720 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b477b90a-75af-4621-8c33-21fdd8c9c749" path="/var/lib/kubelet/pods/b477b90a-75af-4621-8c33-21fdd8c9c749/volumes" Mar 10 09:40:17 crc kubenswrapper[4883]: I0310 09:40:17.449106 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:40:17 crc kubenswrapper[4883]: I0310 09:40:17.449826 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:40:47 crc kubenswrapper[4883]: I0310 09:40:47.449312 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:40:47 crc kubenswrapper[4883]: I0310 09:40:47.450111 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:41:05 crc kubenswrapper[4883]: I0310 09:41:05.893658 4883 scope.go:117] "RemoveContainer" containerID="6864877f7abf0513eaa87f372fd6fb5c7baab57f240f7c0cd19def879aaf0dc8" Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.448918 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.449616 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.449672 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.450315 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.450393 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" gracePeriod=600 Mar 10 09:41:17 crc kubenswrapper[4883]: E0310 09:41:17.578442 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:18 crc kubenswrapper[4883]: I0310 09:41:18.046205 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" exitCode=0 Mar 10 09:41:18 crc kubenswrapper[4883]: I0310 09:41:18.046265 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b"} Mar 10 09:41:18 crc kubenswrapper[4883]: I0310 09:41:18.046864 4883 scope.go:117] "RemoveContainer" containerID="5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d" Mar 10 09:41:18 crc kubenswrapper[4883]: I0310 09:41:18.047650 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:41:18 crc kubenswrapper[4883]: E0310 09:41:18.048043 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:33 crc kubenswrapper[4883]: I0310 09:41:33.080383 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:41:33 crc kubenswrapper[4883]: E0310 09:41:33.081322 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.467181 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:35 crc kubenswrapper[4883]: E0310 09:41:35.467929 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c22b88a-ce63-4c4a-a606-17c563e9e156" containerName="oc" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.467944 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c22b88a-ce63-4c4a-a606-17c563e9e156" containerName="oc" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.468127 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c22b88a-ce63-4c4a-a606-17c563e9e156" containerName="oc" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.469423 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.473977 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.573886 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.573931 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.574660 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.659505 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.661256 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.673804 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676275 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676578 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676796 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676843 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.697252 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.778932 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.779069 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.779096 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.786223 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881182 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881453 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881712 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881825 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " 
pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.897427 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.977165 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:36 crc kubenswrapper[4883]: I0310 09:41:36.272768 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:36 crc kubenswrapper[4883]: I0310 09:41:36.443871 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.224864 4883 generic.go:334] "Generic (PLEG): container finished" podID="d82be883-fb56-4980-855b-29e3c65804f0" containerID="3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2" exitCode=0 Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.224993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerDied","Data":"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2"} Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.225036 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerStarted","Data":"9e2949119360f4c1ef4ce52ba4f41c2ada91976532782b690ae2143396e06593"} Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.227235 4883 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.227536 4883 generic.go:334] "Generic (PLEG): container finished" podID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerID="ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593" exitCode=0 Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.227626 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerDied","Data":"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593"} Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.227821 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerStarted","Data":"451f307721bd95bb6eec1741406d0e7d216607c63a0692d2884bbb664bacc51f"} Mar 10 09:41:38 crc kubenswrapper[4883]: I0310 09:41:38.242856 4883 generic.go:334] "Generic (PLEG): container finished" podID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerID="75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8" exitCode=0 Mar 10 09:41:38 crc kubenswrapper[4883]: I0310 09:41:38.242956 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerDied","Data":"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8"} Mar 10 09:41:39 crc kubenswrapper[4883]: I0310 09:41:39.258160 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerStarted","Data":"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0"} Mar 10 09:41:39 crc kubenswrapper[4883]: I0310 09:41:39.261425 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerStarted","Data":"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5"} Mar 10 09:41:39 crc kubenswrapper[4883]: I0310 09:41:39.284609 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-df97s" podStartSLOduration=2.804009525 podStartE2EDuration="4.284597673s" podCreationTimestamp="2026-03-10 09:41:35 +0000 UTC" firstStartedPulling="2026-03-10 09:41:37.229797687 +0000 UTC m=+2283.484695577" lastFinishedPulling="2026-03-10 09:41:38.710385835 +0000 UTC m=+2284.965283725" observedRunningTime="2026-03-10 09:41:39.281296206 +0000 UTC m=+2285.536194096" watchObservedRunningTime="2026-03-10 09:41:39.284597673 +0000 UTC m=+2285.539495563" Mar 10 09:41:41 crc kubenswrapper[4883]: I0310 09:41:41.282813 4883 generic.go:334] "Generic (PLEG): container finished" podID="d82be883-fb56-4980-855b-29e3c65804f0" containerID="8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5" exitCode=0 Mar 10 09:41:41 crc kubenswrapper[4883]: I0310 09:41:41.282946 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerDied","Data":"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5"} Mar 10 09:41:42 crc kubenswrapper[4883]: I0310 09:41:42.294454 4883 generic.go:334] "Generic (PLEG): container finished" podID="af134b73-8c24-4b9e-b15e-48ff4b83ecd4" containerID="4951ccf796499d6b47c904d8481fa70523d42eff658bee0de028c1741f6614b5" exitCode=0 Mar 10 09:41:42 crc kubenswrapper[4883]: I0310 09:41:42.294546 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" event={"ID":"af134b73-8c24-4b9e-b15e-48ff4b83ecd4","Type":"ContainerDied","Data":"4951ccf796499d6b47c904d8481fa70523d42eff658bee0de028c1741f6614b5"} 
Mar 10 09:41:42 crc kubenswrapper[4883]: I0310 09:41:42.298785 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerStarted","Data":"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468"} Mar 10 09:41:42 crc kubenswrapper[4883]: I0310 09:41:42.330017 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2s8h8" podStartSLOduration=2.744273642 podStartE2EDuration="7.330002342s" podCreationTimestamp="2026-03-10 09:41:35 +0000 UTC" firstStartedPulling="2026-03-10 09:41:37.226996625 +0000 UTC m=+2283.481894514" lastFinishedPulling="2026-03-10 09:41:41.812725325 +0000 UTC m=+2288.067623214" observedRunningTime="2026-03-10 09:41:42.325796621 +0000 UTC m=+2288.580694510" watchObservedRunningTime="2026-03-10 09:41:42.330002342 +0000 UTC m=+2288.584900231" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.675865 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.854928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855026 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855062 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855181 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855251 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: 
I0310 09:41:43.855414 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855448 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855568 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855629 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855724 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855769 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.865150 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2" (OuterVolumeSpecName: "kube-api-access-jd8p2") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "kube-api-access-jd8p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.877276 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.881232 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.882265 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.882811 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.884736 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.886923 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.887374 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.891726 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.893866 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.897390 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory" (OuterVolumeSpecName: "inventory") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960125 4883 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960156 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960168 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960182 4883 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960192 4883 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960201 4883 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960213 4883 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960221 4883 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960232 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960245 4883 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960254 4883 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.104321 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:41:44 crc kubenswrapper[4883]: E0310 09:41:44.104982 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.315169 4883 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" event={"ID":"af134b73-8c24-4b9e-b15e-48ff4b83ecd4","Type":"ContainerDied","Data":"3f29108c5dad1a5578dac32057489e322fee1b450f1de55af05d05420ce128d5"} Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.315630 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f29108c5dad1a5578dac32057489e322fee1b450f1de55af05d05420ce128d5" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.315238 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.407861 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56"] Mar 10 09:41:44 crc kubenswrapper[4883]: E0310 09:41:44.408304 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af134b73-8c24-4b9e-b15e-48ff4b83ecd4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.408324 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="af134b73-8c24-4b9e-b15e-48ff4b83ecd4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.408544 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="af134b73-8c24-4b9e-b15e-48ff4b83ecd4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.409256 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413254 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413271 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413498 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413652 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413800 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.420934 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56"] Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471034 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471110 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471268 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471327 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471442 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm9lt\" (UniqueName: \"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471579 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573150 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm9lt\" (UniqueName: \"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573253 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573302 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 
09:41:44.573329 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573409 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573447 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573503 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.578276 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") 
pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.578974 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.579270 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.579469 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.579760 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" 
Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.582344 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.590325 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm9lt\" (UniqueName: \"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.728204 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.247685 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56"] Mar 10 09:41:45 crc kubenswrapper[4883]: W0310 09:41:45.250072 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb083d3b3_edb7_4d2f_a7b7_f1275bd83fde.slice/crio-63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce WatchSource:0}: Error finding container 63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce: Status 404 returned error can't find the container with id 63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.323256 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" event={"ID":"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde","Type":"ContainerStarted","Data":"63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce"} Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.786587 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.787442 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.830503 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.977701 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.977756 4883 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:46 crc kubenswrapper[4883]: I0310 09:41:46.332772 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" event={"ID":"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde","Type":"ContainerStarted","Data":"6fc9eb7a0760205fbfc253c3603eaaef5b472d6ed42e6664b249cce521606f18"} Mar 10 09:41:46 crc kubenswrapper[4883]: I0310 09:41:46.357841 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" podStartSLOduration=1.677354179 podStartE2EDuration="2.357819054s" podCreationTimestamp="2026-03-10 09:41:44 +0000 UTC" firstStartedPulling="2026-03-10 09:41:45.252626725 +0000 UTC m=+2291.507524605" lastFinishedPulling="2026-03-10 09:41:45.933091592 +0000 UTC m=+2292.187989480" observedRunningTime="2026-03-10 09:41:46.353019403 +0000 UTC m=+2292.607917292" watchObservedRunningTime="2026-03-10 09:41:46.357819054 +0000 UTC m=+2292.612716943" Mar 10 09:41:46 crc kubenswrapper[4883]: I0310 09:41:46.378914 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:46 crc kubenswrapper[4883]: I0310 09:41:46.431835 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:47 crc kubenswrapper[4883]: I0310 09:41:47.018879 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2s8h8" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" probeResult="failure" output=< Mar 10 09:41:47 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:41:47 crc kubenswrapper[4883]: > Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.349336 4883 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/certified-operators-df97s" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="registry-server" containerID="cri-o://3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" gracePeriod=2 Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.737374 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.870017 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") pod \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.870190 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") pod \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.870247 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") pod \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.870929 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities" (OuterVolumeSpecName: "utilities") pod "109ee6ef-7197-40d5-82cf-4b34bcad2ecc" (UID: "109ee6ef-7197-40d5-82cf-4b34bcad2ecc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.871356 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.876673 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf" (OuterVolumeSpecName: "kube-api-access-ndklf") pod "109ee6ef-7197-40d5-82cf-4b34bcad2ecc" (UID: "109ee6ef-7197-40d5-82cf-4b34bcad2ecc"). InnerVolumeSpecName "kube-api-access-ndklf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.915487 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "109ee6ef-7197-40d5-82cf-4b34bcad2ecc" (UID: "109ee6ef-7197-40d5-82cf-4b34bcad2ecc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.973799 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.973831 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367320 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367324 4883 generic.go:334] "Generic (PLEG): container finished" podID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerID="3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" exitCode=0 Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367315 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerDied","Data":"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0"} Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367512 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerDied","Data":"451f307721bd95bb6eec1741406d0e7d216607c63a0692d2884bbb664bacc51f"} Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367543 4883 scope.go:117] "RemoveContainer" containerID="3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.406396 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.409823 4883 scope.go:117] "RemoveContainer" containerID="75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.419338 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.436251 4883 scope.go:117] "RemoveContainer" containerID="ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.473123 4883 scope.go:117] "RemoveContainer" 
containerID="3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" Mar 10 09:41:49 crc kubenswrapper[4883]: E0310 09:41:49.473648 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0\": container with ID starting with 3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0 not found: ID does not exist" containerID="3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.473691 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0"} err="failed to get container status \"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0\": rpc error: code = NotFound desc = could not find container \"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0\": container with ID starting with 3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0 not found: ID does not exist" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.473729 4883 scope.go:117] "RemoveContainer" containerID="75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8" Mar 10 09:41:49 crc kubenswrapper[4883]: E0310 09:41:49.474345 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8\": container with ID starting with 75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8 not found: ID does not exist" containerID="75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.474405 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8"} err="failed to get container status \"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8\": rpc error: code = NotFound desc = could not find container \"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8\": container with ID starting with 75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8 not found: ID does not exist" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.474446 4883 scope.go:117] "RemoveContainer" containerID="ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593" Mar 10 09:41:49 crc kubenswrapper[4883]: E0310 09:41:49.474936 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593\": container with ID starting with ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593 not found: ID does not exist" containerID="ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.474967 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593"} err="failed to get container status \"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593\": rpc error: code = NotFound desc = could not find container \"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593\": container with ID starting with ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593 not found: ID does not exist" Mar 10 09:41:50 crc kubenswrapper[4883]: I0310 09:41:50.091774 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" path="/var/lib/kubelet/pods/109ee6ef-7197-40d5-82cf-4b34bcad2ecc/volumes" Mar 10 09:41:56 crc kubenswrapper[4883]: I0310 
09:41:56.020295 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:56 crc kubenswrapper[4883]: I0310 09:41:56.065792 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:56 crc kubenswrapper[4883]: I0310 09:41:56.084764 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:41:56 crc kubenswrapper[4883]: E0310 09:41:56.085152 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.055418 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.445280 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2s8h8" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" containerID="cri-o://ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" gracePeriod=2 Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.845672 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.966116 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") pod \"d82be883-fb56-4980-855b-29e3c65804f0\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.966272 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") pod \"d82be883-fb56-4980-855b-29e3c65804f0\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.966538 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") pod \"d82be883-fb56-4980-855b-29e3c65804f0\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.966958 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities" (OuterVolumeSpecName: "utilities") pod "d82be883-fb56-4980-855b-29e3c65804f0" (UID: "d82be883-fb56-4980-855b-29e3c65804f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.967086 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.972980 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg" (OuterVolumeSpecName: "kube-api-access-zl7lg") pod "d82be883-fb56-4980-855b-29e3c65804f0" (UID: "d82be883-fb56-4980-855b-29e3c65804f0"). InnerVolumeSpecName "kube-api-access-zl7lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.061212 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d82be883-fb56-4980-855b-29e3c65804f0" (UID: "d82be883-fb56-4980-855b-29e3c65804f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.069510 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.069546 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455298 4883 generic.go:334] "Generic (PLEG): container finished" podID="d82be883-fb56-4980-855b-29e3c65804f0" containerID="ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" exitCode=0 Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455353 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerDied","Data":"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468"} Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455365 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455393 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerDied","Data":"9e2949119360f4c1ef4ce52ba4f41c2ada91976532782b690ae2143396e06593"} Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455415 4883 scope.go:117] "RemoveContainer" containerID="ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.477033 4883 scope.go:117] "RemoveContainer" containerID="8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.481138 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.487306 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.495882 4883 scope.go:117] "RemoveContainer" containerID="3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.535565 4883 scope.go:117] "RemoveContainer" containerID="ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" Mar 10 09:41:58 crc kubenswrapper[4883]: E0310 09:41:58.536175 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468\": container with ID starting with ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468 not found: ID does not exist" containerID="ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.536207 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468"} err="failed to get container status \"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468\": rpc error: code = NotFound desc = could not find container \"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468\": container with ID starting with ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468 not found: ID does not exist" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.536230 4883 scope.go:117] "RemoveContainer" containerID="8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5" Mar 10 09:41:58 crc kubenswrapper[4883]: E0310 09:41:58.536748 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5\": container with ID starting with 8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5 not found: ID does not exist" containerID="8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.536790 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5"} err="failed to get container status \"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5\": rpc error: code = NotFound desc = could not find container \"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5\": container with ID starting with 8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5 not found: ID does not exist" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.536821 4883 scope.go:117] "RemoveContainer" containerID="3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2" Mar 10 09:41:58 crc kubenswrapper[4883]: E0310 
09:41:58.537257 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2\": container with ID starting with 3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2 not found: ID does not exist" containerID="3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.537285 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2"} err="failed to get container status \"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2\": rpc error: code = NotFound desc = could not find container \"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2\": container with ID starting with 3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2 not found: ID does not exist" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.104901 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d82be883-fb56-4980-855b-29e3c65804f0" path="/var/lib/kubelet/pods/d82be883-fb56-4980-855b-29e3c65804f0/volumes" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.137752 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138195 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="extract-content" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138210 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="extract-content" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138236 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="extract-content" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138243 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="extract-content" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138260 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138266 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138288 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138294 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138307 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="extract-utilities" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138313 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="extract-utilities" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138333 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="extract-utilities" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138339 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="extract-utilities" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138603 4883 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138614 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.139407 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.141782 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.141954 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.142428 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.149610 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.213232 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") pod \"auto-csr-approver-29552262-5shjs\" (UID: \"5a7f73ce-4bec-451b-8fc7-a787366b6001\") " pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.315560 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") pod \"auto-csr-approver-29552262-5shjs\" (UID: \"5a7f73ce-4bec-451b-8fc7-a787366b6001\") " 
pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.344018 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") pod \"auto-csr-approver-29552262-5shjs\" (UID: \"5a7f73ce-4bec-451b-8fc7-a787366b6001\") " pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.458838 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.885114 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:42:00 crc kubenswrapper[4883]: W0310 09:42:00.888913 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a7f73ce_4bec_451b_8fc7_a787366b6001.slice/crio-3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265 WatchSource:0}: Error finding container 3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265: Status 404 returned error can't find the container with id 3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265 Mar 10 09:42:01 crc kubenswrapper[4883]: I0310 09:42:01.484514 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552262-5shjs" event={"ID":"5a7f73ce-4bec-451b-8fc7-a787366b6001","Type":"ContainerStarted","Data":"3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265"} Mar 10 09:42:02 crc kubenswrapper[4883]: I0310 09:42:02.493025 4883 generic.go:334] "Generic (PLEG): container finished" podID="5a7f73ce-4bec-451b-8fc7-a787366b6001" containerID="cbfa56661b259829721fddd0e6afd1c57e6fb0cd59f60d7658cb1739ef3cba81" exitCode=0 Mar 10 09:42:02 crc kubenswrapper[4883]: 
I0310 09:42:02.493138 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552262-5shjs" event={"ID":"5a7f73ce-4bec-451b-8fc7-a787366b6001","Type":"ContainerDied","Data":"cbfa56661b259829721fddd0e6afd1c57e6fb0cd59f60d7658cb1739ef3cba81"} Mar 10 09:42:03 crc kubenswrapper[4883]: I0310 09:42:03.802607 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:03 crc kubenswrapper[4883]: I0310 09:42:03.995205 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") pod \"5a7f73ce-4bec-451b-8fc7-a787366b6001\" (UID: \"5a7f73ce-4bec-451b-8fc7-a787366b6001\") " Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.001694 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987" (OuterVolumeSpecName: "kube-api-access-6x987") pod "5a7f73ce-4bec-451b-8fc7-a787366b6001" (UID: "5a7f73ce-4bec-451b-8fc7-a787366b6001"). InnerVolumeSpecName "kube-api-access-6x987". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.098466 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") on node \"crc\" DevicePath \"\"" Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.513646 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552262-5shjs" event={"ID":"5a7f73ce-4bec-451b-8fc7-a787366b6001","Type":"ContainerDied","Data":"3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265"} Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.513695 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.513716 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265" Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.879608 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"] Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.887255 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"] Mar 10 09:42:06 crc kubenswrapper[4883]: I0310 09:42:06.090312 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" path="/var/lib/kubelet/pods/c70e8b0b-51ad-4080-8955-8aa8ee68f274/volumes" Mar 10 09:42:08 crc kubenswrapper[4883]: I0310 09:42:08.080282 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:08 crc kubenswrapper[4883]: E0310 09:42:08.081236 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:42:20 crc kubenswrapper[4883]: I0310 09:42:20.080005 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:20 crc kubenswrapper[4883]: E0310 09:42:20.080953 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:42:33 crc kubenswrapper[4883]: I0310 09:42:33.080591 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:33 crc kubenswrapper[4883]: E0310 09:42:33.081387 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:42:45 crc kubenswrapper[4883]: I0310 09:42:45.080435 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:45 crc kubenswrapper[4883]: E0310 09:42:45.081408 4883 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:42:56 crc kubenswrapper[4883]: I0310 09:42:56.080470 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:56 crc kubenswrapper[4883]: E0310 09:42:56.081732 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:43:06 crc kubenswrapper[4883]: I0310 09:43:06.008189 4883 scope.go:117] "RemoveContainer" containerID="f3c4bd29935ee27cc39cdc84c3f636e36dc2e8b9c1bfc85b30473304c9e1a2ff" Mar 10 09:43:09 crc kubenswrapper[4883]: I0310 09:43:09.080360 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:43:09 crc kubenswrapper[4883]: E0310 09:43:09.081621 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:43:22 crc kubenswrapper[4883]: I0310 09:43:22.079563 4883 scope.go:117] 
"RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:43:22 crc kubenswrapper[4883]: E0310 09:43:22.080493 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:43:36 crc kubenswrapper[4883]: I0310 09:43:36.079493 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:43:36 crc kubenswrapper[4883]: E0310 09:43:36.080412 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:43:38 crc kubenswrapper[4883]: I0310 09:43:38.314510 4883 generic.go:334] "Generic (PLEG): container finished" podID="b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" containerID="6fc9eb7a0760205fbfc253c3603eaaef5b472d6ed42e6664b249cce521606f18" exitCode=0 Mar 10 09:43:38 crc kubenswrapper[4883]: I0310 09:43:38.314585 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" event={"ID":"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde","Type":"ContainerDied","Data":"6fc9eb7a0760205fbfc253c3603eaaef5b472d6ed42e6664b249cce521606f18"} Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.644856 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.719953 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.720000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.742997 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.743740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821164 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821246 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821300 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821331 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm9lt\" (UniqueName: \"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821362 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821677 4883 
reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821694 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.823967 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt" (OuterVolumeSpecName: "kube-api-access-lm9lt") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "kube-api-access-lm9lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.824170 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.839888 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.840069 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.841092 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory" (OuterVolumeSpecName: "inventory") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924242 4883 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924532 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924543 4883 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924553 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm9lt\" (UniqueName: 
\"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924562 4883 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:40 crc kubenswrapper[4883]: I0310 09:43:40.332347 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" event={"ID":"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde","Type":"ContainerDied","Data":"63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce"} Mar 10 09:43:40 crc kubenswrapper[4883]: I0310 09:43:40.332616 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce" Mar 10 09:43:40 crc kubenswrapper[4883]: I0310 09:43:40.332396 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:43:47 crc kubenswrapper[4883]: I0310 09:43:47.080645 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:43:47 crc kubenswrapper[4883]: E0310 09:43:47.081768 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.080369 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:00 crc kubenswrapper[4883]: E0310 09:44:00.081319 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.139522 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"] Mar 10 09:44:00 crc kubenswrapper[4883]: E0310 09:44:00.139985 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140006 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 09:44:00 crc kubenswrapper[4883]: E0310 09:44:00.140017 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7f73ce-4bec-451b-8fc7-a787366b6001" containerName="oc" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140023 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7f73ce-4bec-451b-8fc7-a787366b6001" containerName="oc" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140227 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140251 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7f73ce-4bec-451b-8fc7-a787366b6001" containerName="oc" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140959 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.143973 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.144139 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.144634 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.149499 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"] Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.177337 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") pod \"auto-csr-approver-29552264-fxbzh\" (UID: \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\") " pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.279911 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") pod \"auto-csr-approver-29552264-fxbzh\" (UID: \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\") " pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.298317 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") pod \"auto-csr-approver-29552264-fxbzh\" (UID: \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\") " 
pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.459822 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.888918 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"] Mar 10 09:44:00 crc kubenswrapper[4883]: W0310 09:44:00.898963 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7561a55_c8cc_4fad_99cf_6a81612efa5f.slice/crio-944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7 WatchSource:0}: Error finding container 944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7: Status 404 returned error can't find the container with id 944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7 Mar 10 09:44:01 crc kubenswrapper[4883]: I0310 09:44:01.512129 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" event={"ID":"e7561a55-c8cc-4fad-99cf-6a81612efa5f","Type":"ContainerStarted","Data":"944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7"} Mar 10 09:44:02 crc kubenswrapper[4883]: I0310 09:44:02.532141 4883 generic.go:334] "Generic (PLEG): container finished" podID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" containerID="7a20c7b029586fbd175b90818495ae7de2932811c342c11864ee48f481c0032f" exitCode=0 Mar 10 09:44:02 crc kubenswrapper[4883]: I0310 09:44:02.532235 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" event={"ID":"e7561a55-c8cc-4fad-99cf-6a81612efa5f","Type":"ContainerDied","Data":"7a20c7b029586fbd175b90818495ae7de2932811c342c11864ee48f481c0032f"} Mar 10 09:44:03 crc kubenswrapper[4883]: I0310 09:44:03.830727 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:03 crc kubenswrapper[4883]: I0310 09:44:03.962105 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") pod \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\" (UID: \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\") " Mar 10 09:44:03 crc kubenswrapper[4883]: I0310 09:44:03.973759 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j" (OuterVolumeSpecName: "kube-api-access-m8h9j") pod "e7561a55-c8cc-4fad-99cf-6a81612efa5f" (UID: "e7561a55-c8cc-4fad-99cf-6a81612efa5f"). InnerVolumeSpecName "kube-api-access-m8h9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.065305 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") on node \"crc\" DevicePath \"\"" Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.554029 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" event={"ID":"e7561a55-c8cc-4fad-99cf-6a81612efa5f","Type":"ContainerDied","Data":"944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7"} Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.554347 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7" Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.554091 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.892907 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.903866 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:44:06 crc kubenswrapper[4883]: I0310 09:44:06.092020 4883 scope.go:117] "RemoveContainer" containerID="0172a22ca1b2674eaf7b22f058338bb6d8a1a070eb6300e099b6179e3eec55d7" Mar 10 09:44:06 crc kubenswrapper[4883]: I0310 09:44:06.095344 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" path="/var/lib/kubelet/pods/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5/volumes" Mar 10 09:44:12 crc kubenswrapper[4883]: I0310 09:44:12.080373 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:12 crc kubenswrapper[4883]: E0310 09:44:12.081138 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:24 crc kubenswrapper[4883]: I0310 09:44:24.086231 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:24 crc kubenswrapper[4883]: E0310 09:44:24.087371 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.079410 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 09:44:25 crc kubenswrapper[4883]: E0310 09:44:25.080263 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" containerName="oc" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.080333 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" containerName="oc" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.080640 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" containerName="oc" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.081746 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.083442 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.084129 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.084179 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.085207 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fm4md" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.085322 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143460 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143545 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") 
pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143584 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143625 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143655 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143710 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143735 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143767 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245550 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245630 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245670 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245701 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") 
" pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245780 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245802 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245844 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245881 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc 
kubenswrapper[4883]: I0310 09:44:25.246577 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.246667 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.247022 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.247044 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.247275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.252067 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.252878 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.254190 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.260805 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.268007 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.404442 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.828204 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 09:44:26 crc kubenswrapper[4883]: I0310 09:44:26.750063 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d483d791-15b3-49e7-8095-5660a9d0fdaa","Type":"ContainerStarted","Data":"5e9e0098f227f9af35dc0b77276abeb28187e6a5424e5047b3daacd6cc5a8286"} Mar 10 09:44:36 crc kubenswrapper[4883]: I0310 09:44:36.081531 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:36 crc kubenswrapper[4883]: E0310 09:44:36.082684 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:49 crc kubenswrapper[4883]: I0310 09:44:49.080397 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:49 crc kubenswrapper[4883]: E0310 09:44:49.081362 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:54 crc kubenswrapper[4883]: E0310 09:44:54.579062 4883 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 10 09:44:54 crc kubenswrapper[4883]: E0310 09:44:54.579596 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh
_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczsq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d483d791-15b3-49e7-8095-5660a9d0fdaa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:44:54 crc kubenswrapper[4883]: E0310 09:44:54.580819 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" Mar 10 09:44:55 crc kubenswrapper[4883]: E0310 09:44:55.047240 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.148547 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7"] Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.150260 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.151451 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7"] Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.152951 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.153861 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.261238 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.261616 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") pod 
\"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.262084 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xkbb\" (UniqueName: \"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.364833 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xkbb\" (UniqueName: \"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.364913 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.364961 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.365840 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.371862 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.381650 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xkbb\" (UniqueName: \"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.469052 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.870227 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7"] Mar 10 09:45:01 crc kubenswrapper[4883]: I0310 09:45:01.080009 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:01 crc kubenswrapper[4883]: E0310 09:45:01.080236 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:45:01 crc kubenswrapper[4883]: I0310 09:45:01.094827 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" event={"ID":"71905d96-5939-40cc-99ff-40da96706a63","Type":"ContainerStarted","Data":"b70c70acb4f545566eaae90a1cb0e6aa80d1cb1b44c83724d42075e959d24dbd"} Mar 10 09:45:01 crc kubenswrapper[4883]: I0310 09:45:01.094879 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" event={"ID":"71905d96-5939-40cc-99ff-40da96706a63","Type":"ContainerStarted","Data":"450ab2e2dbe2709169f581b8f1940b8e48c5198bc1a429dda05302a678f451db"} Mar 10 09:45:01 crc kubenswrapper[4883]: I0310 09:45:01.110925 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" podStartSLOduration=1.110904033 podStartE2EDuration="1.110904033s" podCreationTimestamp="2026-03-10 09:45:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:45:01.107442426 +0000 UTC m=+2487.362340315" watchObservedRunningTime="2026-03-10 09:45:01.110904033 +0000 UTC m=+2487.365801922" Mar 10 09:45:02 crc kubenswrapper[4883]: I0310 09:45:02.109561 4883 generic.go:334] "Generic (PLEG): container finished" podID="71905d96-5939-40cc-99ff-40da96706a63" containerID="b70c70acb4f545566eaae90a1cb0e6aa80d1cb1b44c83724d42075e959d24dbd" exitCode=0 Mar 10 09:45:02 crc kubenswrapper[4883]: I0310 09:45:02.109881 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" event={"ID":"71905d96-5939-40cc-99ff-40da96706a63","Type":"ContainerDied","Data":"b70c70acb4f545566eaae90a1cb0e6aa80d1cb1b44c83724d42075e959d24dbd"} Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.410221 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.539943 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") pod \"71905d96-5939-40cc-99ff-40da96706a63\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.540034 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") pod \"71905d96-5939-40cc-99ff-40da96706a63\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.540066 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xkbb\" (UniqueName: 
\"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") pod \"71905d96-5939-40cc-99ff-40da96706a63\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.540539 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume" (OuterVolumeSpecName: "config-volume") pod "71905d96-5939-40cc-99ff-40da96706a63" (UID: "71905d96-5939-40cc-99ff-40da96706a63"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.546974 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "71905d96-5939-40cc-99ff-40da96706a63" (UID: "71905d96-5939-40cc-99ff-40da96706a63"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.547039 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb" (OuterVolumeSpecName: "kube-api-access-9xkbb") pod "71905d96-5939-40cc-99ff-40da96706a63" (UID: "71905d96-5939-40cc-99ff-40da96706a63"). InnerVolumeSpecName "kube-api-access-9xkbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.643263 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.643558 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.643572 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xkbb\" (UniqueName: \"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.127708 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" event={"ID":"71905d96-5939-40cc-99ff-40da96706a63","Type":"ContainerDied","Data":"450ab2e2dbe2709169f581b8f1940b8e48c5198bc1a429dda05302a678f451db"} Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.127754 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="450ab2e2dbe2709169f581b8f1940b8e48c5198bc1a429dda05302a678f451db" Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.127806 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.482223 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"] Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.488658 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"] Mar 10 09:45:06 crc kubenswrapper[4883]: I0310 09:45:06.090639 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" path="/var/lib/kubelet/pods/0be14f8e-b9d8-4058-9be3-cdc61ce88626/volumes" Mar 10 09:45:06 crc kubenswrapper[4883]: I0310 09:45:06.170700 4883 scope.go:117] "RemoveContainer" containerID="d15aaac8a80d44b120c58eb49e71ed5d06a04daa936e43f5e09fe3fbce5d0142" Mar 10 09:45:07 crc kubenswrapper[4883]: I0310 09:45:07.645687 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 09:45:09 crc kubenswrapper[4883]: I0310 09:45:09.178734 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d483d791-15b3-49e7-8095-5660a9d0fdaa","Type":"ContainerStarted","Data":"20faf1bc2dd52b1aabee2636feb1570644b5e51b82c37399b21f107d33a5382f"} Mar 10 09:45:09 crc kubenswrapper[4883]: I0310 09:45:09.198971 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.3887127870000002 podStartE2EDuration="45.198932078s" podCreationTimestamp="2026-03-10 09:44:24 +0000 UTC" firstStartedPulling="2026-03-10 09:44:25.832184966 +0000 UTC m=+2452.087082855" lastFinishedPulling="2026-03-10 09:45:07.642404257 +0000 UTC m=+2493.897302146" observedRunningTime="2026-03-10 09:45:09.196741778 +0000 UTC m=+2495.451639667" watchObservedRunningTime="2026-03-10 
09:45:09.198932078 +0000 UTC m=+2495.453829967" Mar 10 09:45:16 crc kubenswrapper[4883]: I0310 09:45:16.079715 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:16 crc kubenswrapper[4883]: E0310 09:45:16.080656 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:45:30 crc kubenswrapper[4883]: I0310 09:45:30.080170 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:30 crc kubenswrapper[4883]: E0310 09:45:30.080972 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:45:44 crc kubenswrapper[4883]: I0310 09:45:44.086439 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:44 crc kubenswrapper[4883]: E0310 09:45:44.087301 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:45:59 crc kubenswrapper[4883]: I0310 09:45:59.080009 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:59 crc kubenswrapper[4883]: E0310 09:45:59.080803 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.136694 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"] Mar 10 09:46:00 crc kubenswrapper[4883]: E0310 09:46:00.137158 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71905d96-5939-40cc-99ff-40da96706a63" containerName="collect-profiles" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.137174 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71905d96-5939-40cc-99ff-40da96706a63" containerName="collect-profiles" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.137373 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71905d96-5939-40cc-99ff-40da96706a63" containerName="collect-profiles" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.138087 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.139996 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.140169 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.142767 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.143631 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"] Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.313295 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") pod \"auto-csr-approver-29552266-685zg\" (UID: \"3a46f17b-70fa-415b-a58a-05fabe683062\") " pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.415716 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") pod \"auto-csr-approver-29552266-685zg\" (UID: \"3a46f17b-70fa-415b-a58a-05fabe683062\") " pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.432525 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") pod \"auto-csr-approver-29552266-685zg\" (UID: \"3a46f17b-70fa-415b-a58a-05fabe683062\") " 
pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.456951 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.852244 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"] Mar 10 09:46:01 crc kubenswrapper[4883]: I0310 09:46:01.621712 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552266-685zg" event={"ID":"3a46f17b-70fa-415b-a58a-05fabe683062","Type":"ContainerStarted","Data":"d9b18f5de6dd8fcb31bd49f075439310a45028a1667415604446e36a280624de"} Mar 10 09:46:02 crc kubenswrapper[4883]: I0310 09:46:02.633726 4883 generic.go:334] "Generic (PLEG): container finished" podID="3a46f17b-70fa-415b-a58a-05fabe683062" containerID="8f1431de5f428e41dddc47217c2e968809dd1f4b2b6ca77bcaf70fa3ca340a9d" exitCode=0 Mar 10 09:46:02 crc kubenswrapper[4883]: I0310 09:46:02.633851 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552266-685zg" event={"ID":"3a46f17b-70fa-415b-a58a-05fabe683062","Type":"ContainerDied","Data":"8f1431de5f428e41dddc47217c2e968809dd1f4b2b6ca77bcaf70fa3ca340a9d"} Mar 10 09:46:03 crc kubenswrapper[4883]: I0310 09:46:03.956281 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.097140 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") pod \"3a46f17b-70fa-415b-a58a-05fabe683062\" (UID: \"3a46f17b-70fa-415b-a58a-05fabe683062\") " Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.103815 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9" (OuterVolumeSpecName: "kube-api-access-6pjq9") pod "3a46f17b-70fa-415b-a58a-05fabe683062" (UID: "3a46f17b-70fa-415b-a58a-05fabe683062"). InnerVolumeSpecName "kube-api-access-6pjq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.201060 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") on node \"crc\" DevicePath \"\"" Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.654105 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552266-685zg" event={"ID":"3a46f17b-70fa-415b-a58a-05fabe683062","Type":"ContainerDied","Data":"d9b18f5de6dd8fcb31bd49f075439310a45028a1667415604446e36a280624de"} Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.654165 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9b18f5de6dd8fcb31bd49f075439310a45028a1667415604446e36a280624de" Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.654183 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:05 crc kubenswrapper[4883]: I0310 09:46:05.018748 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:46:05 crc kubenswrapper[4883]: I0310 09:46:05.025172 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:46:06 crc kubenswrapper[4883]: I0310 09:46:06.091861 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c22b88a-ce63-4c4a-a606-17c563e9e156" path="/var/lib/kubelet/pods/9c22b88a-ce63-4c4a-a606-17c563e9e156/volumes" Mar 10 09:46:06 crc kubenswrapper[4883]: I0310 09:46:06.233733 4883 scope.go:117] "RemoveContainer" containerID="43ec4b59a40b1bfb044dd60703dd00f3cac266b590c5664aa4f0403a553e3872" Mar 10 09:46:11 crc kubenswrapper[4883]: I0310 09:46:11.081305 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:46:11 crc kubenswrapper[4883]: E0310 09:46:11.082657 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:46:23 crc kubenswrapper[4883]: I0310 09:46:23.079388 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:46:23 crc kubenswrapper[4883]: I0310 09:46:23.843294 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef"} Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.718321 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:09 crc kubenswrapper[4883]: E0310 09:47:09.719546 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a46f17b-70fa-415b-a58a-05fabe683062" containerName="oc" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.719562 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a46f17b-70fa-415b-a58a-05fabe683062" containerName="oc" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.719777 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a46f17b-70fa-415b-a58a-05fabe683062" containerName="oc" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.721347 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.739236 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.823673 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.823805 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") pod \"redhat-marketplace-hnmd7\" (UID: 
\"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.823887 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926102 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926258 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926335 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926935 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") pod \"redhat-marketplace-hnmd7\" (UID: 
\"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926986 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.947827 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:10 crc kubenswrapper[4883]: I0310 09:47:10.037026 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:10 crc kubenswrapper[4883]: I0310 09:47:10.444863 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:11 crc kubenswrapper[4883]: I0310 09:47:11.254957 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2cfe728-0af8-40ab-9378-8567163d6489" containerID="83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00" exitCode=0 Mar 10 09:47:11 crc kubenswrapper[4883]: I0310 09:47:11.255066 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerDied","Data":"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00"} Mar 10 09:47:11 crc kubenswrapper[4883]: I0310 09:47:11.255372 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" 
event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerStarted","Data":"fb137214e3717144cbad13264c0b2115ce2602654d08f4640418db6656487038"} Mar 10 09:47:11 crc kubenswrapper[4883]: I0310 09:47:11.258271 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:47:12 crc kubenswrapper[4883]: I0310 09:47:12.265568 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerStarted","Data":"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875"} Mar 10 09:47:13 crc kubenswrapper[4883]: I0310 09:47:13.275050 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2cfe728-0af8-40ab-9378-8567163d6489" containerID="e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875" exitCode=0 Mar 10 09:47:13 crc kubenswrapper[4883]: I0310 09:47:13.275101 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerDied","Data":"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875"} Mar 10 09:47:14 crc kubenswrapper[4883]: I0310 09:47:14.289228 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerStarted","Data":"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480"} Mar 10 09:47:14 crc kubenswrapper[4883]: I0310 09:47:14.306697 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hnmd7" podStartSLOduration=2.807112862 podStartE2EDuration="5.306675575s" podCreationTimestamp="2026-03-10 09:47:09 +0000 UTC" firstStartedPulling="2026-03-10 09:47:11.25790419 +0000 UTC m=+2617.512802078" lastFinishedPulling="2026-03-10 09:47:13.757466892 +0000 UTC 
m=+2620.012364791" observedRunningTime="2026-03-10 09:47:14.306584303 +0000 UTC m=+2620.561482192" watchObservedRunningTime="2026-03-10 09:47:14.306675575 +0000 UTC m=+2620.561573465" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.037837 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.038683 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.089129 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.391409 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.441489 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:22 crc kubenswrapper[4883]: I0310 09:47:22.371138 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hnmd7" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="registry-server" containerID="cri-o://9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" gracePeriod=2 Mar 10 09:47:22 crc kubenswrapper[4883]: I0310 09:47:22.819785 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.022119 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") pod \"c2cfe728-0af8-40ab-9378-8567163d6489\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.022266 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") pod \"c2cfe728-0af8-40ab-9378-8567163d6489\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.022314 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") pod \"c2cfe728-0af8-40ab-9378-8567163d6489\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.022833 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities" (OuterVolumeSpecName: "utilities") pod "c2cfe728-0af8-40ab-9378-8567163d6489" (UID: "c2cfe728-0af8-40ab-9378-8567163d6489"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.029803 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7" (OuterVolumeSpecName: "kube-api-access-p6dl7") pod "c2cfe728-0af8-40ab-9378-8567163d6489" (UID: "c2cfe728-0af8-40ab-9378-8567163d6489"). InnerVolumeSpecName "kube-api-access-p6dl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.056079 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2cfe728-0af8-40ab-9378-8567163d6489" (UID: "c2cfe728-0af8-40ab-9378-8567163d6489"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.124737 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.124773 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.124787 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") on node \"crc\" DevicePath \"\"" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384267 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2cfe728-0af8-40ab-9378-8567163d6489" containerID="9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" exitCode=0 Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384449 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerDied","Data":"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480"} Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384713 4883 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerDied","Data":"fb137214e3717144cbad13264c0b2115ce2602654d08f4640418db6656487038"} Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384746 4883 scope.go:117] "RemoveContainer" containerID="9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384604 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.405639 4883 scope.go:117] "RemoveContainer" containerID="e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.416607 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.421702 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.438370 4883 scope.go:117] "RemoveContainer" containerID="83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.456531 4883 scope.go:117] "RemoveContainer" containerID="9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" Mar 10 09:47:23 crc kubenswrapper[4883]: E0310 09:47:23.457028 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480\": container with ID starting with 9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480 not found: ID does not exist" containerID="9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457065 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480"} err="failed to get container status \"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480\": rpc error: code = NotFound desc = could not find container \"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480\": container with ID starting with 9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480 not found: ID does not exist" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457093 4883 scope.go:117] "RemoveContainer" containerID="e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875" Mar 10 09:47:23 crc kubenswrapper[4883]: E0310 09:47:23.457385 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875\": container with ID starting with e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875 not found: ID does not exist" containerID="e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457416 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875"} err="failed to get container status \"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875\": rpc error: code = NotFound desc = could not find container \"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875\": container with ID starting with e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875 not found: ID does not exist" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457439 4883 scope.go:117] "RemoveContainer" containerID="83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00" Mar 10 09:47:23 crc kubenswrapper[4883]: E0310 
09:47:23.457896 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00\": container with ID starting with 83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00 not found: ID does not exist" containerID="83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457932 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00"} err="failed to get container status \"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00\": rpc error: code = NotFound desc = could not find container \"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00\": container with ID starting with 83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00 not found: ID does not exist" Mar 10 09:47:24 crc kubenswrapper[4883]: I0310 09:47:24.093780 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" path="/var/lib/kubelet/pods/c2cfe728-0af8-40ab-9378-8567163d6489/volumes" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.140978 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:48:00 crc kubenswrapper[4883]: E0310 09:48:00.142400 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="registry-server" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.142419 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="registry-server" Mar 10 09:48:00 crc kubenswrapper[4883]: E0310 09:48:00.142440 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="extract-content" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.142447 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="extract-content" Mar 10 09:48:00 crc kubenswrapper[4883]: E0310 09:48:00.142493 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="extract-utilities" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.142500 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="extract-utilities" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.142747 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="registry-server" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.143527 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.148584 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.148628 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.148777 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.149785 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.267177 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8qm7\" (UniqueName: 
\"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") pod \"auto-csr-approver-29552268-mwcmv\" (UID: \"4e44dc59-cae3-44ee-87bf-2b85d5850682\") " pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.368624 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8qm7\" (UniqueName: \"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") pod \"auto-csr-approver-29552268-mwcmv\" (UID: \"4e44dc59-cae3-44ee-87bf-2b85d5850682\") " pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.386689 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8qm7\" (UniqueName: \"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") pod \"auto-csr-approver-29552268-mwcmv\" (UID: \"4e44dc59-cae3-44ee-87bf-2b85d5850682\") " pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.459393 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.856686 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:48:01 crc kubenswrapper[4883]: I0310 09:48:01.728901 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" event={"ID":"4e44dc59-cae3-44ee-87bf-2b85d5850682","Type":"ContainerStarted","Data":"6394610a49512c2011f25bcc020f3a3af9a16665fd86bd626d17107c1489da0a"} Mar 10 09:48:02 crc kubenswrapper[4883]: I0310 09:48:02.739920 4883 generic.go:334] "Generic (PLEG): container finished" podID="4e44dc59-cae3-44ee-87bf-2b85d5850682" containerID="cae180ac84c542adc5e640936e665d82183bbb27f65b7e1e59e92d435d368a52" exitCode=0 Mar 10 09:48:02 crc kubenswrapper[4883]: I0310 09:48:02.740026 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" event={"ID":"4e44dc59-cae3-44ee-87bf-2b85d5850682","Type":"ContainerDied","Data":"cae180ac84c542adc5e640936e665d82183bbb27f65b7e1e59e92d435d368a52"} Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.056920 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.251607 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8qm7\" (UniqueName: \"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") pod \"4e44dc59-cae3-44ee-87bf-2b85d5850682\" (UID: \"4e44dc59-cae3-44ee-87bf-2b85d5850682\") " Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.257147 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7" (OuterVolumeSpecName: "kube-api-access-j8qm7") pod "4e44dc59-cae3-44ee-87bf-2b85d5850682" (UID: "4e44dc59-cae3-44ee-87bf-2b85d5850682"). InnerVolumeSpecName "kube-api-access-j8qm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.353993 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8qm7\" (UniqueName: \"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.756861 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" event={"ID":"4e44dc59-cae3-44ee-87bf-2b85d5850682","Type":"ContainerDied","Data":"6394610a49512c2011f25bcc020f3a3af9a16665fd86bd626d17107c1489da0a"} Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.757199 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6394610a49512c2011f25bcc020f3a3af9a16665fd86bd626d17107c1489da0a" Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.756915 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:05 crc kubenswrapper[4883]: I0310 09:48:05.125064 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:48:05 crc kubenswrapper[4883]: I0310 09:48:05.138385 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:48:06 crc kubenswrapper[4883]: I0310 09:48:06.088689 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7f73ce-4bec-451b-8fc7-a787366b6001" path="/var/lib/kubelet/pods/5a7f73ce-4bec-451b-8fc7-a787366b6001/volumes" Mar 10 09:48:06 crc kubenswrapper[4883]: I0310 09:48:06.330246 4883 scope.go:117] "RemoveContainer" containerID="cbfa56661b259829721fddd0e6afd1c57e6fb0cd59f60d7658cb1739ef3cba81" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.289821 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:45 crc kubenswrapper[4883]: E0310 09:48:45.291050 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e44dc59-cae3-44ee-87bf-2b85d5850682" containerName="oc" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.291065 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e44dc59-cae3-44ee-87bf-2b85d5850682" containerName="oc" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.291283 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e44dc59-cae3-44ee-87bf-2b85d5850682" containerName="oc" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.292955 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.304027 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.345002 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.345118 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.345554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.448109 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.448382 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.448453 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.449029 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.449098 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.468841 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.613070 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:46 crc kubenswrapper[4883]: I0310 09:48:46.127116 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.113626 4883 generic.go:334] "Generic (PLEG): container finished" podID="42973d9a-2054-4a79-b789-8dfba272a471" containerID="a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab" exitCode=0 Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.113703 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerDied","Data":"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab"} Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.113974 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerStarted","Data":"c1efafaa06cf0a95cadd9a4584b4c6a8a5ceb29a673366e3e20b9973500195f0"} Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.448978 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.449029 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:48:48 crc kubenswrapper[4883]: I0310 09:48:48.123289 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerStarted","Data":"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644"} Mar 10 09:48:49 crc kubenswrapper[4883]: I0310 09:48:49.139361 4883 generic.go:334] "Generic (PLEG): container finished" podID="42973d9a-2054-4a79-b789-8dfba272a471" containerID="f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644" exitCode=0 Mar 10 09:48:49 crc kubenswrapper[4883]: I0310 09:48:49.139412 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerDied","Data":"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644"} Mar 10 09:48:50 crc kubenswrapper[4883]: I0310 09:48:50.151495 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerStarted","Data":"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35"} Mar 10 09:48:50 crc kubenswrapper[4883]: I0310 09:48:50.176455 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bjqqv" podStartSLOduration=2.666995884 podStartE2EDuration="5.176428908s" podCreationTimestamp="2026-03-10 09:48:45 +0000 UTC" firstStartedPulling="2026-03-10 09:48:47.116151045 +0000 UTC m=+2713.371048934" lastFinishedPulling="2026-03-10 09:48:49.62558407 +0000 UTC m=+2715.880481958" observedRunningTime="2026-03-10 09:48:50.170510889 +0000 UTC m=+2716.425408778" watchObservedRunningTime="2026-03-10 09:48:50.176428908 +0000 UTC m=+2716.431326797" Mar 10 09:48:55 crc kubenswrapper[4883]: I0310 09:48:55.613946 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:55 crc kubenswrapper[4883]: I0310 
09:48:55.614593 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:55 crc kubenswrapper[4883]: I0310 09:48:55.657358 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:56 crc kubenswrapper[4883]: I0310 09:48:56.256518 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:56 crc kubenswrapper[4883]: I0310 09:48:56.327662 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.229922 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bjqqv" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="registry-server" containerID="cri-o://1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" gracePeriod=2 Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.662048 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.811890 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") pod \"42973d9a-2054-4a79-b789-8dfba272a471\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.812364 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") pod \"42973d9a-2054-4a79-b789-8dfba272a471\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.812436 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") pod \"42973d9a-2054-4a79-b789-8dfba272a471\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.813615 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities" (OuterVolumeSpecName: "utilities") pod "42973d9a-2054-4a79-b789-8dfba272a471" (UID: "42973d9a-2054-4a79-b789-8dfba272a471"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.821166 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd" (OuterVolumeSpecName: "kube-api-access-zlqkd") pod "42973d9a-2054-4a79-b789-8dfba272a471" (UID: "42973d9a-2054-4a79-b789-8dfba272a471"). InnerVolumeSpecName "kube-api-access-zlqkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.856621 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42973d9a-2054-4a79-b789-8dfba272a471" (UID: "42973d9a-2054-4a79-b789-8dfba272a471"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.914918 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.914949 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.914960 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243580 4883 generic.go:334] "Generic (PLEG): container finished" podID="42973d9a-2054-4a79-b789-8dfba272a471" containerID="1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" exitCode=0 Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243662 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerDied","Data":"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35"} Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243723 4883 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerDied","Data":"c1efafaa06cf0a95cadd9a4584b4c6a8a5ceb29a673366e3e20b9973500195f0"} Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243751 4883 scope.go:117] "RemoveContainer" containerID="1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243745 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.273243 4883 scope.go:117] "RemoveContainer" containerID="f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.273265 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.282113 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.303098 4883 scope.go:117] "RemoveContainer" containerID="a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.323953 4883 scope.go:117] "RemoveContainer" containerID="1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" Mar 10 09:48:59 crc kubenswrapper[4883]: E0310 09:48:59.324377 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35\": container with ID starting with 1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35 not found: ID does not exist" containerID="1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 
09:48:59.324429 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35"} err="failed to get container status \"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35\": rpc error: code = NotFound desc = could not find container \"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35\": container with ID starting with 1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35 not found: ID does not exist" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.324463 4883 scope.go:117] "RemoveContainer" containerID="f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644" Mar 10 09:48:59 crc kubenswrapper[4883]: E0310 09:48:59.324754 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644\": container with ID starting with f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644 not found: ID does not exist" containerID="f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.324782 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644"} err="failed to get container status \"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644\": rpc error: code = NotFound desc = could not find container \"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644\": container with ID starting with f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644 not found: ID does not exist" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.324800 4883 scope.go:117] "RemoveContainer" containerID="a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab" Mar 10 09:48:59 crc 
kubenswrapper[4883]: E0310 09:48:59.325011 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab\": container with ID starting with a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab not found: ID does not exist" containerID="a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.325034 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab"} err="failed to get container status \"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab\": rpc error: code = NotFound desc = could not find container \"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab\": container with ID starting with a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab not found: ID does not exist" Mar 10 09:49:00 crc kubenswrapper[4883]: I0310 09:49:00.090526 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42973d9a-2054-4a79-b789-8dfba272a471" path="/var/lib/kubelet/pods/42973d9a-2054-4a79-b789-8dfba272a471/volumes" Mar 10 09:49:17 crc kubenswrapper[4883]: I0310 09:49:17.449043 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:49:17 crc kubenswrapper[4883]: I0310 09:49:17.449661 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.449559 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.450337 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.450413 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.451415 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.451496 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef" gracePeriod=600 Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.675725 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" 
containerID="42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef" exitCode=0 Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.675785 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef"} Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.676028 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:49:48 crc kubenswrapper[4883]: I0310 09:49:48.688040 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"} Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.146834 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"] Mar 10 09:50:00 crc kubenswrapper[4883]: E0310 09:50:00.148047 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="registry-server" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.148061 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="registry-server" Mar 10 09:50:00 crc kubenswrapper[4883]: E0310 09:50:00.148087 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="extract-content" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.148094 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="extract-content" Mar 10 09:50:00 crc kubenswrapper[4883]: E0310 09:50:00.148134 4883 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="extract-utilities" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.148140 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="extract-utilities" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.148447 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="registry-server" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.149450 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.152148 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.152262 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.153327 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.158690 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"] Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.244387 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") pod \"auto-csr-approver-29552270-wwj5c\" (UID: \"c376c647-e032-465a-8abc-e8ae35219822\") " pod="openshift-infra/auto-csr-approver-29552270-wwj5c" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.345888 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") pod \"auto-csr-approver-29552270-wwj5c\" (UID: \"c376c647-e032-465a-8abc-e8ae35219822\") " pod="openshift-infra/auto-csr-approver-29552270-wwj5c" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.364598 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") pod \"auto-csr-approver-29552270-wwj5c\" (UID: \"c376c647-e032-465a-8abc-e8ae35219822\") " pod="openshift-infra/auto-csr-approver-29552270-wwj5c" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.473152 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.891209 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"] Mar 10 09:50:01 crc kubenswrapper[4883]: I0310 09:50:01.823107 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" event={"ID":"c376c647-e032-465a-8abc-e8ae35219822","Type":"ContainerStarted","Data":"f82360dd0b039ebdd0bd386bf762895c7a1cce10ae7224ac6e29e92d41a28d69"} Mar 10 09:50:02 crc kubenswrapper[4883]: I0310 09:50:02.835966 4883 generic.go:334] "Generic (PLEG): container finished" podID="c376c647-e032-465a-8abc-e8ae35219822" containerID="d6974f5cfcc3316424a08bec1cdff0b6d759b4edf301ad2eefc7e6b26b5cc6f9" exitCode=0 Mar 10 09:50:02 crc kubenswrapper[4883]: I0310 09:50:02.836081 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" event={"ID":"c376c647-e032-465a-8abc-e8ae35219822","Type":"ContainerDied","Data":"d6974f5cfcc3316424a08bec1cdff0b6d759b4edf301ad2eefc7e6b26b5cc6f9"} Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 
09:50:04.190872 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.227721 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") pod \"c376c647-e032-465a-8abc-e8ae35219822\" (UID: \"c376c647-e032-465a-8abc-e8ae35219822\") " Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.233184 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg" (OuterVolumeSpecName: "kube-api-access-6r5fg") pod "c376c647-e032-465a-8abc-e8ae35219822" (UID: "c376c647-e032-465a-8abc-e8ae35219822"). InnerVolumeSpecName "kube-api-access-6r5fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.331337 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") on node \"crc\" DevicePath \"\"" Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.858291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" event={"ID":"c376c647-e032-465a-8abc-e8ae35219822","Type":"ContainerDied","Data":"f82360dd0b039ebdd0bd386bf762895c7a1cce10ae7224ac6e29e92d41a28d69"} Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.858363 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.858375 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f82360dd0b039ebdd0bd386bf762895c7a1cce10ae7224ac6e29e92d41a28d69" Mar 10 09:50:05 crc kubenswrapper[4883]: I0310 09:50:05.258875 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"] Mar 10 09:50:05 crc kubenswrapper[4883]: I0310 09:50:05.265675 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"] Mar 10 09:50:06 crc kubenswrapper[4883]: I0310 09:50:06.088285 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" path="/var/lib/kubelet/pods/e7561a55-c8cc-4fad-99cf-6a81612efa5f/volumes" Mar 10 09:50:06 crc kubenswrapper[4883]: I0310 09:50:06.440468 4883 scope.go:117] "RemoveContainer" containerID="7a20c7b029586fbd175b90818495ae7de2932811c342c11864ee48f481c0032f" Mar 10 09:51:47 crc kubenswrapper[4883]: I0310 09:51:47.449155 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:51:47 crc kubenswrapper[4883]: I0310 09:51:47.449839 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.597933 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"] Mar 10 
09:51:51 crc kubenswrapper[4883]: E0310 09:51:51.598916 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c376c647-e032-465a-8abc-e8ae35219822" containerName="oc" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.598932 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c376c647-e032-465a-8abc-e8ae35219822" containerName="oc" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.599176 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c376c647-e032-465a-8abc-e8ae35219822" containerName="oc" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.600605 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.609834 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.610051 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.610079 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc 
kubenswrapper[4883]: I0310 09:51:51.610858 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"] Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712017 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712183 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712206 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712767 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712812 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") pod \"redhat-operators-2zqxs\" (UID: 
\"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.730002 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.918884 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:51:52 crc kubenswrapper[4883]: I0310 09:51:52.328636 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"] Mar 10 09:51:52 crc kubenswrapper[4883]: I0310 09:51:52.781259 4883 generic.go:334] "Generic (PLEG): container finished" podID="91051722-2538-461e-bacc-795d4c2dd312" containerID="762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481" exitCode=0 Mar 10 09:51:52 crc kubenswrapper[4883]: I0310 09:51:52.781307 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerDied","Data":"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481"} Mar 10 09:51:52 crc kubenswrapper[4883]: I0310 09:51:52.781334 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerStarted","Data":"1616cc638f92b5334f353a113714aba9530cd72972f5516666f36f97cb2cc4cc"} Mar 10 09:51:53 crc kubenswrapper[4883]: I0310 09:51:53.795173 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" 
event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerStarted","Data":"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"} Mar 10 09:51:54 crc kubenswrapper[4883]: E0310 09:51:54.426077 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91051722_2538_461e_bacc_795d4c2dd312.slice/crio-21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:51:56 crc kubenswrapper[4883]: I0310 09:51:56.821104 4883 generic.go:334] "Generic (PLEG): container finished" podID="91051722-2538-461e-bacc-795d4c2dd312" containerID="21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1" exitCode=0 Mar 10 09:51:56 crc kubenswrapper[4883]: I0310 09:51:56.821154 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerDied","Data":"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"} Mar 10 09:51:57 crc kubenswrapper[4883]: I0310 09:51:57.835996 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerStarted","Data":"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"} Mar 10 09:51:57 crc kubenswrapper[4883]: I0310 09:51:57.854203 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2zqxs" podStartSLOduration=2.352187888 podStartE2EDuration="6.854162674s" podCreationTimestamp="2026-03-10 09:51:51 +0000 UTC" firstStartedPulling="2026-03-10 09:51:52.783558405 +0000 UTC m=+2899.038456294" lastFinishedPulling="2026-03-10 09:51:57.285533191 +0000 UTC m=+2903.540431080" observedRunningTime="2026-03-10 09:51:57.852099894 +0000 UTC 
m=+2904.106997783" watchObservedRunningTime="2026-03-10 09:51:57.854162674 +0000 UTC m=+2904.109060563" Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.144414 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"] Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.146548 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.150033 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.150057 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"] Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.150757 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.151506 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.186676 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") pod \"auto-csr-approver-29552272-7b8fg\" (UID: \"4fd9f896-b725-4a44-825a-9fd728da26b2\") " pod="openshift-infra/auto-csr-approver-29552272-7b8fg" Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.288319 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") pod \"auto-csr-approver-29552272-7b8fg\" (UID: \"4fd9f896-b725-4a44-825a-9fd728da26b2\") " 
pod="openshift-infra/auto-csr-approver-29552272-7b8fg" Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.308534 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") pod \"auto-csr-approver-29552272-7b8fg\" (UID: \"4fd9f896-b725-4a44-825a-9fd728da26b2\") " pod="openshift-infra/auto-csr-approver-29552272-7b8fg" Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.467418 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.916242 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"] Mar 10 09:52:00 crc kubenswrapper[4883]: W0310 09:52:00.917138 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fd9f896_b725_4a44_825a_9fd728da26b2.slice/crio-2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403 WatchSource:0}: Error finding container 2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403: Status 404 returned error can't find the container with id 2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403 Mar 10 09:52:01 crc kubenswrapper[4883]: I0310 09:52:01.886912 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" event={"ID":"4fd9f896-b725-4a44-825a-9fd728da26b2","Type":"ContainerStarted","Data":"2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403"} Mar 10 09:52:01 crc kubenswrapper[4883]: I0310 09:52:01.919768 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:52:01 crc kubenswrapper[4883]: I0310 09:52:01.920492 4883 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:52:02 crc kubenswrapper[4883]: I0310 09:52:02.897387 4883 generic.go:334] "Generic (PLEG): container finished" podID="4fd9f896-b725-4a44-825a-9fd728da26b2" containerID="55da525eb21d992e868ae1c36ae9269aafbac403e9bea6b7d9b244d9b58e489c" exitCode=0 Mar 10 09:52:02 crc kubenswrapper[4883]: I0310 09:52:02.897563 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" event={"ID":"4fd9f896-b725-4a44-825a-9fd728da26b2","Type":"ContainerDied","Data":"55da525eb21d992e868ae1c36ae9269aafbac403e9bea6b7d9b244d9b58e489c"} Mar 10 09:52:02 crc kubenswrapper[4883]: I0310 09:52:02.957256 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2zqxs" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server" probeResult="failure" output=< Mar 10 09:52:02 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:52:02 crc kubenswrapper[4883]: > Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.201218 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.275953 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") pod \"4fd9f896-b725-4a44-825a-9fd728da26b2\" (UID: \"4fd9f896-b725-4a44-825a-9fd728da26b2\") " Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.280685 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv" (OuterVolumeSpecName: "kube-api-access-ngtqv") pod "4fd9f896-b725-4a44-825a-9fd728da26b2" (UID: "4fd9f896-b725-4a44-825a-9fd728da26b2"). InnerVolumeSpecName "kube-api-access-ngtqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.379609 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.919487 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" event={"ID":"4fd9f896-b725-4a44-825a-9fd728da26b2","Type":"ContainerDied","Data":"2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403"} Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.919545 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403" Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.919542 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" Mar 10 09:52:05 crc kubenswrapper[4883]: I0310 09:52:05.268202 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"] Mar 10 09:52:05 crc kubenswrapper[4883]: I0310 09:52:05.277205 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"] Mar 10 09:52:06 crc kubenswrapper[4883]: I0310 09:52:06.090305 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a46f17b-70fa-415b-a58a-05fabe683062" path="/var/lib/kubelet/pods/3a46f17b-70fa-415b-a58a-05fabe683062/volumes" Mar 10 09:52:06 crc kubenswrapper[4883]: I0310 09:52:06.537182 4883 scope.go:117] "RemoveContainer" containerID="8f1431de5f428e41dddc47217c2e968809dd1f4b2b6ca77bcaf70fa3ca340a9d" Mar 10 09:52:11 crc kubenswrapper[4883]: I0310 09:52:11.959921 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:52:11 crc kubenswrapper[4883]: I0310 09:52:11.999324 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2zqxs" Mar 10 09:52:12 crc kubenswrapper[4883]: I0310 09:52:12.192848 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"] Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.001738 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2zqxs" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server" containerID="cri-o://0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49" gracePeriod=2 Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.424519 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.527932 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") pod \"91051722-2538-461e-bacc-795d4c2dd312\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") "
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.528139 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") pod \"91051722-2538-461e-bacc-795d4c2dd312\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") "
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.528195 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") pod \"91051722-2538-461e-bacc-795d4c2dd312\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") "
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.529301 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities" (OuterVolumeSpecName: "utilities") pod "91051722-2538-461e-bacc-795d4c2dd312" (UID: "91051722-2538-461e-bacc-795d4c2dd312"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.534067 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p" (OuterVolumeSpecName: "kube-api-access-vv88p") pod "91051722-2538-461e-bacc-795d4c2dd312" (UID: "91051722-2538-461e-bacc-795d4c2dd312"). InnerVolumeSpecName "kube-api-access-vv88p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.624374 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91051722-2538-461e-bacc-795d4c2dd312" (UID: "91051722-2538-461e-bacc-795d4c2dd312"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.631870 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.631909 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.631927 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014153 4883 generic.go:334] "Generic (PLEG): container finished" podID="91051722-2538-461e-bacc-795d4c2dd312" containerID="0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49" exitCode=0
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014213 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerDied","Data":"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"}
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014248 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014268 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerDied","Data":"1616cc638f92b5334f353a113714aba9530cd72972f5516666f36f97cb2cc4cc"}
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014291 4883 scope.go:117] "RemoveContainer" containerID="0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.051009 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"]
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.051606 4883 scope.go:117] "RemoveContainer" containerID="21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.060679 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"]
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.074569 4883 scope.go:117] "RemoveContainer" containerID="762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.091726 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91051722-2538-461e-bacc-795d4c2dd312" path="/var/lib/kubelet/pods/91051722-2538-461e-bacc-795d4c2dd312/volumes"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.105618 4883 scope.go:117] "RemoveContainer" containerID="0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"
Mar 10 09:52:14 crc kubenswrapper[4883]: E0310 09:52:14.106360 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49\": container with ID starting with 0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49 not found: ID does not exist" containerID="0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.106402 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"} err="failed to get container status \"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49\": rpc error: code = NotFound desc = could not find container \"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49\": container with ID starting with 0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49 not found: ID does not exist"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.106441 4883 scope.go:117] "RemoveContainer" containerID="21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"
Mar 10 09:52:14 crc kubenswrapper[4883]: E0310 09:52:14.107066 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1\": container with ID starting with 21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1 not found: ID does not exist" containerID="21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.107112 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"} err="failed to get container status \"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1\": rpc error: code = NotFound desc = could not find container \"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1\": container with ID starting with 21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1 not found: ID does not exist"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.107159 4883 scope.go:117] "RemoveContainer" containerID="762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481"
Mar 10 09:52:14 crc kubenswrapper[4883]: E0310 09:52:14.107678 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481\": container with ID starting with 762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481 not found: ID does not exist" containerID="762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.107707 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481"} err="failed to get container status \"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481\": rpc error: code = NotFound desc = could not find container \"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481\": container with ID starting with 762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481 not found: ID does not exist"
Mar 10 09:52:17 crc kubenswrapper[4883]: I0310 09:52:17.449160 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:52:17 crc kubenswrapper[4883]: I0310 09:52:17.449545 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.852541 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"]
Mar 10 09:52:32 crc kubenswrapper[4883]: E0310 09:52:32.853644 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853662 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server"
Mar 10 09:52:32 crc kubenswrapper[4883]: E0310 09:52:32.853694 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd9f896-b725-4a44-825a-9fd728da26b2" containerName="oc"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853700 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd9f896-b725-4a44-825a-9fd728da26b2" containerName="oc"
Mar 10 09:52:32 crc kubenswrapper[4883]: E0310 09:52:32.853743 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="extract-content"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853749 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="extract-content"
Mar 10 09:52:32 crc kubenswrapper[4883]: E0310 09:52:32.853760 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="extract-utilities"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853766 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="extract-utilities"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853990 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.854004 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd9f896-b725-4a44-825a-9fd728da26b2" containerName="oc"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.855462 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.860123 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"]
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.974080 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.974544 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kknn8\" (UniqueName: \"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.974695 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.077410 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kknn8\" (UniqueName: \"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.077528 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.077661 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.078210 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.078285 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.097780 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kknn8\" (UniqueName: \"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.173927 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.609528 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"]
Mar 10 09:52:34 crc kubenswrapper[4883]: I0310 09:52:34.187706 4883 generic.go:334] "Generic (PLEG): container finished" podID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerID="b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f" exitCode=0
Mar 10 09:52:34 crc kubenswrapper[4883]: I0310 09:52:34.187923 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerDied","Data":"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f"}
Mar 10 09:52:34 crc kubenswrapper[4883]: I0310 09:52:34.188737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerStarted","Data":"2d644c394c26845e44dd788d0cb0b57d0709245acf92637ed359eb376459943a"}
Mar 10 09:52:34 crc kubenswrapper[4883]: I0310 09:52:34.190015 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 09:52:35 crc kubenswrapper[4883]: I0310 09:52:35.198927 4883 generic.go:334] "Generic (PLEG): container finished" podID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerID="ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3" exitCode=0
Mar 10 09:52:35 crc kubenswrapper[4883]: I0310 09:52:35.199016 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerDied","Data":"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3"}
Mar 10 09:52:35 crc kubenswrapper[4883]: E0310 09:52:35.274882 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2beac4_fd7d_47e1_89c3_27c1490ee6b1.slice/crio-conmon-ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 09:52:36 crc kubenswrapper[4883]: I0310 09:52:36.209111 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerStarted","Data":"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1"}
Mar 10 09:52:36 crc kubenswrapper[4883]: I0310 09:52:36.232698 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tlmsb" podStartSLOduration=2.678836044 podStartE2EDuration="4.232679198s" podCreationTimestamp="2026-03-10 09:52:32 +0000 UTC" firstStartedPulling="2026-03-10 09:52:34.189695843 +0000 UTC m=+2940.444593742" lastFinishedPulling="2026-03-10 09:52:35.743539007 +0000 UTC m=+2941.998436896" observedRunningTime="2026-03-10 09:52:36.224790248 +0000 UTC m=+2942.479688138" watchObservedRunningTime="2026-03-10 09:52:36.232679198 +0000 UTC m=+2942.487577087"
Mar 10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.174403 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.174811 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.211334 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.303832 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.443458 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"]
Mar 10 09:52:45 crc kubenswrapper[4883]: I0310 09:52:45.286697 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tlmsb" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="registry-server" containerID="cri-o://d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1" gracePeriod=2
Mar 10 09:52:45 crc kubenswrapper[4883]: E0310 09:52:45.475161 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2beac4_fd7d_47e1_89c3_27c1490ee6b1.slice/crio-d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.289346 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294693 4883 generic.go:334] "Generic (PLEG): container finished" podID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerID="d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1" exitCode=0
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerDied","Data":"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1"}
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294762 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlmsb"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294782 4883 scope.go:117] "RemoveContainer" containerID="d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294771 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerDied","Data":"2d644c394c26845e44dd788d0cb0b57d0709245acf92637ed359eb376459943a"}
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.313035 4883 scope.go:117] "RemoveContainer" containerID="ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.330871 4883 scope.go:117] "RemoveContainer" containerID="b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.369572 4883 scope.go:117] "RemoveContainer" containerID="d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1"
Mar 10 09:52:46 crc kubenswrapper[4883]: E0310 09:52:46.371979 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1\": container with ID starting with d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1 not found: ID does not exist" containerID="d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.372030 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1"} err="failed to get container status \"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1\": rpc error: code = NotFound desc = could not find container \"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1\": container with ID starting with d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1 not found: ID does not exist"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.372064 4883 scope.go:117] "RemoveContainer" containerID="ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3"
Mar 10 09:52:46 crc kubenswrapper[4883]: E0310 09:52:46.372813 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3\": container with ID starting with ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3 not found: ID does not exist" containerID="ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.372854 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3"} err="failed to get container status \"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3\": rpc error: code = NotFound desc = could not find container \"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3\": container with ID starting with ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3 not found: ID does not exist"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.372900 4883 scope.go:117] "RemoveContainer" containerID="b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f"
Mar 10 09:52:46 crc kubenswrapper[4883]: E0310 09:52:46.373305 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f\": container with ID starting with b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f not found: ID does not exist" containerID="b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.373329 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f"} err="failed to get container status \"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f\": rpc error: code = NotFound desc = could not find container \"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f\": container with ID starting with b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f not found: ID does not exist"
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.430512 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kknn8\" (UniqueName: \"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") pod \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") "
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.430580 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") pod \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") "
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.430743 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") pod \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") "
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.431614 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities" (OuterVolumeSpecName: "utilities") pod "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" (UID: "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.436041 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8" (OuterVolumeSpecName: "kube-api-access-kknn8") pod "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" (UID: "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1"). InnerVolumeSpecName "kube-api-access-kknn8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.474384 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" (UID: "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.533899 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.533930 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kknn8\" (UniqueName: \"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.533943 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.623801 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"]
Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.630213 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"]
Mar 10 09:52:47 crc kubenswrapper[4883]: I0310 09:52:47.448995 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:52:47 crc kubenswrapper[4883]: I0310 09:52:47.449295 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:52:47 crc kubenswrapper[4883]: I0310 09:52:47.449376 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:52:47 crc kubenswrapper[4883]: I0310 09:52:47.449992 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 09:52:47 crc kubenswrapper[4883]: I0310 09:52:47.450060 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" gracePeriod=600
Mar 10 09:52:47 crc kubenswrapper[4883]: E0310 09:52:47.571359 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.090803 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" path="/var/lib/kubelet/pods/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1/volumes"
Mar 10 09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.313216 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" exitCode=0
Mar 10 09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.313304 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"}
Mar 10 09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.313464 4883 scope.go:117] "RemoveContainer" containerID="42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef"
Mar 10 09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.314359 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"
Mar 10 09:52:48 crc kubenswrapper[4883]: E0310 09:52:48.314679 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:52:59 crc kubenswrapper[4883]: I0310 09:52:59.080795 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"
Mar 10 09:52:59 crc kubenswrapper[4883]: E0310 09:52:59.081765 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:53:11 crc kubenswrapper[4883]: I0310 09:53:11.080241 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"
Mar 10 09:53:11 crc kubenswrapper[4883]: E0310 09:53:11.081341 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:53:23 crc kubenswrapper[4883]: I0310 09:53:23.080385 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"
Mar 10 09:53:23 crc kubenswrapper[4883]: E0310 09:53:23.081343 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:53:37 crc kubenswrapper[4883]: I0310 09:53:37.079681 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"
Mar 10 09:53:37 crc kubenswrapper[4883]: E0310 09:53:37.080612 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:53:52 crc kubenswrapper[4883]: I0310 09:53:52.081693 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"
Mar 10 09:53:52 crc kubenswrapper[4883]: E0310 09:53:52.083003 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.141972 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"]
Mar 10 09:54:00 crc kubenswrapper[4883]: E0310 09:54:00.142953 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="extract-content"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.142965 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="extract-content"
Mar 10 09:54:00 crc kubenswrapper[4883]: E0310 09:54:00.142980 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="registry-server"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.142988 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="registry-server"
Mar 10 09:54:00 crc kubenswrapper[4883]: E0310 09:54:00.142995 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="extract-utilities"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.143001 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="extract-utilities"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.143178 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="registry-server"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.143864 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-p9wph"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.147006 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"]
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.147291 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.147346 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.147409 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.165797 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") pod \"auto-csr-approver-29552274-p9wph\" (UID: \"05d7064d-bd5b-4775-ab8f-2d5780f76440\") " pod="openshift-infra/auto-csr-approver-29552274-p9wph"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.266967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") pod \"auto-csr-approver-29552274-p9wph\" (UID: \"05d7064d-bd5b-4775-ab8f-2d5780f76440\") " pod="openshift-infra/auto-csr-approver-29552274-p9wph"
Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.284373 4883
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") pod \"auto-csr-approver-29552274-p9wph\" (UID: \"05d7064d-bd5b-4775-ab8f-2d5780f76440\") " pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.468698 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.867644 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"] Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.934028 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552274-p9wph" event={"ID":"05d7064d-bd5b-4775-ab8f-2d5780f76440","Type":"ContainerStarted","Data":"a99d8f0ac49315b672c2118f1d6221fe71f4e07f66807a02b130f45f33fa72fe"} Mar 10 09:54:02 crc kubenswrapper[4883]: I0310 09:54:02.969016 4883 generic.go:334] "Generic (PLEG): container finished" podID="05d7064d-bd5b-4775-ab8f-2d5780f76440" containerID="37b94d707e9a0d88465d35e2d3c44d0202d4cba279ab4acfb2218748019ab99d" exitCode=0 Mar 10 09:54:02 crc kubenswrapper[4883]: I0310 09:54:02.969217 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552274-p9wph" event={"ID":"05d7064d-bd5b-4775-ab8f-2d5780f76440","Type":"ContainerDied","Data":"37b94d707e9a0d88465d35e2d3c44d0202d4cba279ab4acfb2218748019ab99d"} Mar 10 09:54:04 crc kubenswrapper[4883]: I0310 09:54:04.323315 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:04 crc kubenswrapper[4883]: I0310 09:54:04.358275 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") pod \"05d7064d-bd5b-4775-ab8f-2d5780f76440\" (UID: \"05d7064d-bd5b-4775-ab8f-2d5780f76440\") " Mar 10 09:54:04 crc kubenswrapper[4883]: I0310 09:54:04.365492 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9" (OuterVolumeSpecName: "kube-api-access-7gmm9") pod "05d7064d-bd5b-4775-ab8f-2d5780f76440" (UID: "05d7064d-bd5b-4775-ab8f-2d5780f76440"). InnerVolumeSpecName "kube-api-access-7gmm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:54:04 crc kubenswrapper[4883]: I0310 09:54:04.461184 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.002153 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552274-p9wph" event={"ID":"05d7064d-bd5b-4775-ab8f-2d5780f76440","Type":"ContainerDied","Data":"a99d8f0ac49315b672c2118f1d6221fe71f4e07f66807a02b130f45f33fa72fe"} Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.002200 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.002219 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a99d8f0ac49315b672c2118f1d6221fe71f4e07f66807a02b130f45f33fa72fe" Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.080281 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:05 crc kubenswrapper[4883]: E0310 09:54:05.080596 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.395151 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.400262 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:54:06 crc kubenswrapper[4883]: I0310 09:54:06.092545 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e44dc59-cae3-44ee-87bf-2b85d5850682" path="/var/lib/kubelet/pods/4e44dc59-cae3-44ee-87bf-2b85d5850682/volumes" Mar 10 09:54:06 crc kubenswrapper[4883]: I0310 09:54:06.643492 4883 scope.go:117] "RemoveContainer" containerID="cae180ac84c542adc5e640936e665d82183bbb27f65b7e1e59e92d435d368a52" Mar 10 09:54:18 crc kubenswrapper[4883]: I0310 09:54:18.080639 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:18 crc kubenswrapper[4883]: E0310 09:54:18.081461 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:54:31 crc kubenswrapper[4883]: I0310 09:54:31.079729 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:31 crc kubenswrapper[4883]: E0310 09:54:31.080699 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:54:45 crc kubenswrapper[4883]: I0310 09:54:45.080343 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:45 crc kubenswrapper[4883]: E0310 09:54:45.081267 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:54:49 crc kubenswrapper[4883]: I0310 09:54:49.354297 4883 generic.go:334] "Generic (PLEG): container finished" podID="d483d791-15b3-49e7-8095-5660a9d0fdaa" 
containerID="20faf1bc2dd52b1aabee2636feb1570644b5e51b82c37399b21f107d33a5382f" exitCode=0 Mar 10 09:54:49 crc kubenswrapper[4883]: I0310 09:54:49.354389 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d483d791-15b3-49e7-8095-5660a9d0fdaa","Type":"ContainerDied","Data":"20faf1bc2dd52b1aabee2636feb1570644b5e51b82c37399b21f107d33a5382f"} Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.651063 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799655 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799720 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799755 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799785 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 
09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799816 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799836 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799892 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799935 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799971 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.800383 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.800760 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data" (OuterVolumeSpecName: "config-data") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.805982 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.806444 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.811512 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq" (OuterVolumeSpecName: "kube-api-access-rczsq") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "kube-api-access-rczsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.831408 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.844347 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.845195 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.846989 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902849 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902882 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902893 4883 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902903 4883 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902934 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902943 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902951 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902960 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902968 4883 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.917437 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 10 09:54:51 crc kubenswrapper[4883]: I0310 09:54:51.005287 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:51 crc kubenswrapper[4883]: I0310 09:54:51.374210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d483d791-15b3-49e7-8095-5660a9d0fdaa","Type":"ContainerDied","Data":"5e9e0098f227f9af35dc0b77276abeb28187e6a5424e5047b3daacd6cc5a8286"} Mar 10 09:54:51 crc kubenswrapper[4883]: I0310 09:54:51.374253 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e9e0098f227f9af35dc0b77276abeb28187e6a5424e5047b3daacd6cc5a8286" Mar 10 09:54:51 crc 
kubenswrapper[4883]: I0310 09:54:51.374516 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:54:59 crc kubenswrapper[4883]: I0310 09:54:59.080060 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:59 crc kubenswrapper[4883]: E0310 09:54:59.081163 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.247939 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 09:55:00 crc kubenswrapper[4883]: E0310 09:55:00.248386 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" containerName="tempest-tests-tempest-tests-runner" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.248406 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" containerName="tempest-tests-tempest-tests-runner" Mar 10 09:55:00 crc kubenswrapper[4883]: E0310 09:55:00.248448 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05d7064d-bd5b-4775-ab8f-2d5780f76440" containerName="oc" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.248454 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d7064d-bd5b-4775-ab8f-2d5780f76440" containerName="oc" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.248703 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="05d7064d-bd5b-4775-ab8f-2d5780f76440" 
containerName="oc" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.248724 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" containerName="tempest-tests-tempest-tests-runner" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.249441 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.251852 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fm4md" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.254950 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.362734 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.362855 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n85x2\" (UniqueName: \"kubernetes.io/projected/4d76dec9-afd2-4850-aacb-c8d60819fc1e-kube-api-access-n85x2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.464893 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.465120 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n85x2\" (UniqueName: \"kubernetes.io/projected/4d76dec9-afd2-4850-aacb-c8d60819fc1e-kube-api-access-n85x2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.465302 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.482189 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n85x2\" (UniqueName: \"kubernetes.io/projected/4d76dec9-afd2-4850-aacb-c8d60819fc1e-kube-api-access-n85x2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.488279 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.566881 4883 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.962119 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 09:55:01 crc kubenswrapper[4883]: I0310 09:55:01.457538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4d76dec9-afd2-4850-aacb-c8d60819fc1e","Type":"ContainerStarted","Data":"fb012b78fec124e385c7eee81b71f76f1eab5053ff907510140e23e830544a46"} Mar 10 09:55:02 crc kubenswrapper[4883]: I0310 09:55:02.475902 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4d76dec9-afd2-4850-aacb-c8d60819fc1e","Type":"ContainerStarted","Data":"9f3c8731c0248aa3b4b8e965278c9cc8b6078c54b8d2341e9ffa46e5dc4eafd6"} Mar 10 09:55:02 crc kubenswrapper[4883]: I0310 09:55:02.496590 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.5190385979999999 podStartE2EDuration="2.496574936s" podCreationTimestamp="2026-03-10 09:55:00 +0000 UTC" firstStartedPulling="2026-03-10 09:55:00.970066023 +0000 UTC m=+3087.224963912" lastFinishedPulling="2026-03-10 09:55:01.947602361 +0000 UTC m=+3088.202500250" observedRunningTime="2026-03-10 09:55:02.489401685 +0000 UTC m=+3088.744299575" watchObservedRunningTime="2026-03-10 09:55:02.496574936 +0000 UTC m=+3088.751472825" Mar 10 09:55:10 crc kubenswrapper[4883]: I0310 09:55:10.080831 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:55:10 crc kubenswrapper[4883]: E0310 09:55:10.081825 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.294866 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"] Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.297050 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.301744 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9vgtk"/"openshift-service-ca.crt" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.301967 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9vgtk"/"kube-root-ca.crt" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.322625 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"] Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.438627 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.439201 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") pod \"must-gather-bkkgz\" (UID: 
\"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.540391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.540531 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.540986 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.563218 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.614254 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:21 crc kubenswrapper[4883]: I0310 09:55:21.077821 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"] Mar 10 09:55:21 crc kubenswrapper[4883]: I0310 09:55:21.684063 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" event={"ID":"e940d297-b038-48e9-a4bd-777df629de28","Type":"ContainerStarted","Data":"6595242d952bf7e089a911d59e9a0a5d5af6e08fb65750a6f303235b58a953f9"} Mar 10 09:55:23 crc kubenswrapper[4883]: I0310 09:55:23.080248 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:55:23 crc kubenswrapper[4883]: E0310 09:55:23.080887 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:55:27 crc kubenswrapper[4883]: I0310 09:55:27.736616 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" event={"ID":"e940d297-b038-48e9-a4bd-777df629de28","Type":"ContainerStarted","Data":"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"} Mar 10 09:55:27 crc kubenswrapper[4883]: I0310 09:55:27.736961 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" event={"ID":"e940d297-b038-48e9-a4bd-777df629de28","Type":"ContainerStarted","Data":"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"} Mar 10 09:55:27 crc kubenswrapper[4883]: I0310 09:55:27.759412 4883 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" podStartSLOduration=2.262921114 podStartE2EDuration="7.759391325s" podCreationTimestamp="2026-03-10 09:55:20 +0000 UTC" firstStartedPulling="2026-03-10 09:55:21.088402145 +0000 UTC m=+3107.343300034" lastFinishedPulling="2026-03-10 09:55:26.584872356 +0000 UTC m=+3112.839770245" observedRunningTime="2026-03-10 09:55:27.756230096 +0000 UTC m=+3114.011127985" watchObservedRunningTime="2026-03-10 09:55:27.759391325 +0000 UTC m=+3114.014289214" Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.890361 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-rz69b"] Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.891912 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.893772 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9vgtk"/"default-dockercfg-7b2hr" Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.981189 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.981516 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.082282 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.082341 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.082532 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.101464 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.209578 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: W0310 09:55:30.241563 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a396c68_b7c1_4eda_abc2_563cdd15fee3.slice/crio-67677ee8b72e7307a6d63c64fb27dbf478ad12ab018dcf5dc306b6b5c263a6ce WatchSource:0}: Error finding container 67677ee8b72e7307a6d63c64fb27dbf478ad12ab018dcf5dc306b6b5c263a6ce: Status 404 returned error can't find the container with id 67677ee8b72e7307a6d63c64fb27dbf478ad12ab018dcf5dc306b6b5c263a6ce Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.765222 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" event={"ID":"7a396c68-b7c1-4eda-abc2-563cdd15fee3","Type":"ContainerStarted","Data":"67677ee8b72e7307a6d63c64fb27dbf478ad12ab018dcf5dc306b6b5c263a6ce"} Mar 10 09:55:38 crc kubenswrapper[4883]: I0310 09:55:38.079817 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:55:38 crc kubenswrapper[4883]: E0310 09:55:38.080605 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:55:40 crc kubenswrapper[4883]: I0310 09:55:40.863751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" event={"ID":"7a396c68-b7c1-4eda-abc2-563cdd15fee3","Type":"ContainerStarted","Data":"3d6e4b449f9ccbd94f8a72789781de51d066c6a7bae2debb32621e755120d6e1"} Mar 10 09:55:40 crc kubenswrapper[4883]: I0310 09:55:40.883804 4883 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" podStartSLOduration=1.914598512 podStartE2EDuration="11.883784442s" podCreationTimestamp="2026-03-10 09:55:29 +0000 UTC" firstStartedPulling="2026-03-10 09:55:30.244077838 +0000 UTC m=+3116.498975727" lastFinishedPulling="2026-03-10 09:55:40.213263778 +0000 UTC m=+3126.468161657" observedRunningTime="2026-03-10 09:55:40.877090584 +0000 UTC m=+3127.131988473" watchObservedRunningTime="2026-03-10 09:55:40.883784442 +0000 UTC m=+3127.138682330" Mar 10 09:55:52 crc kubenswrapper[4883]: I0310 09:55:52.080209 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:55:52 crc kubenswrapper[4883]: E0310 09:55:52.081362 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.148622 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.151055 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.153698 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.162910 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.164868 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.164909 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.215594 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") pod \"auto-csr-approver-29552276-gr77g\" (UID: \"4c430570-1b7a-4e38-9a8b-f13d69c18882\") " pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.318328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") pod \"auto-csr-approver-29552276-gr77g\" (UID: \"4c430570-1b7a-4e38-9a8b-f13d69c18882\") " pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.345291 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") pod \"auto-csr-approver-29552276-gr77g\" (UID: \"4c430570-1b7a-4e38-9a8b-f13d69c18882\") " 
pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.472489 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.888983 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 09:56:01 crc kubenswrapper[4883]: I0310 09:56:01.061979 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552276-gr77g" event={"ID":"4c430570-1b7a-4e38-9a8b-f13d69c18882","Type":"ContainerStarted","Data":"3e6dfca9351496f6c9565e98895c334390a1acdd73ab5b64f2c561f681606d78"} Mar 10 09:56:03 crc kubenswrapper[4883]: I0310 09:56:03.081335 4883 generic.go:334] "Generic (PLEG): container finished" podID="4c430570-1b7a-4e38-9a8b-f13d69c18882" containerID="637bb0552a72f7592feca76119ba2d1ac02ce406f7badd427582618ad5b1a1db" exitCode=0 Mar 10 09:56:03 crc kubenswrapper[4883]: I0310 09:56:03.081463 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552276-gr77g" event={"ID":"4c430570-1b7a-4e38-9a8b-f13d69c18882","Type":"ContainerDied","Data":"637bb0552a72f7592feca76119ba2d1ac02ce406f7badd427582618ad5b1a1db"} Mar 10 09:56:04 crc kubenswrapper[4883]: I0310 09:56:04.392404 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:04 crc kubenswrapper[4883]: I0310 09:56:04.406081 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") pod \"4c430570-1b7a-4e38-9a8b-f13d69c18882\" (UID: \"4c430570-1b7a-4e38-9a8b-f13d69c18882\") " Mar 10 09:56:04 crc kubenswrapper[4883]: I0310 09:56:04.412882 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq" (OuterVolumeSpecName: "kube-api-access-46zdq") pod "4c430570-1b7a-4e38-9a8b-f13d69c18882" (UID: "4c430570-1b7a-4e38-9a8b-f13d69c18882"). InnerVolumeSpecName "kube-api-access-46zdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:56:04 crc kubenswrapper[4883]: I0310 09:56:04.510369 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.080136 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:56:05 crc kubenswrapper[4883]: E0310 09:56:05.080452 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.097376 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552276-gr77g" event={"ID":"4c430570-1b7a-4e38-9a8b-f13d69c18882","Type":"ContainerDied","Data":"3e6dfca9351496f6c9565e98895c334390a1acdd73ab5b64f2c561f681606d78"} Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.097411 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.097418 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e6dfca9351496f6c9565e98895c334390a1acdd73ab5b64f2c561f681606d78" Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.454544 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"] Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.464178 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"] Mar 10 09:56:06 crc kubenswrapper[4883]: I0310 09:56:06.092405 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c376c647-e032-465a-8abc-e8ae35219822" path="/var/lib/kubelet/pods/c376c647-e032-465a-8abc-e8ae35219822/volumes" Mar 10 09:56:06 crc kubenswrapper[4883]: I0310 09:56:06.717124 4883 scope.go:117] "RemoveContainer" containerID="d6974f5cfcc3316424a08bec1cdff0b6d759b4edf301ad2eefc7e6b26b5cc6f9" Mar 10 09:56:10 crc kubenswrapper[4883]: I0310 09:56:10.141887 4883 generic.go:334] "Generic (PLEG): container finished" podID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" containerID="3d6e4b449f9ccbd94f8a72789781de51d066c6a7bae2debb32621e755120d6e1" exitCode=0 Mar 10 09:56:10 crc kubenswrapper[4883]: I0310 09:56:10.141979 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" event={"ID":"7a396c68-b7c1-4eda-abc2-563cdd15fee3","Type":"ContainerDied","Data":"3d6e4b449f9ccbd94f8a72789781de51d066c6a7bae2debb32621e755120d6e1"} Mar 10 09:56:11 crc 
kubenswrapper[4883]: I0310 09:56:11.234744 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.246130 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") pod \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.246212 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") pod \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.246641 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host" (OuterVolumeSpecName: "host") pod "7a396c68-b7c1-4eda-abc2-563cdd15fee3" (UID: "7a396c68-b7c1-4eda-abc2-563cdd15fee3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.251165 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb" (OuterVolumeSpecName: "kube-api-access-trvfb") pod "7a396c68-b7c1-4eda-abc2-563cdd15fee3" (UID: "7a396c68-b7c1-4eda-abc2-563cdd15fee3"). InnerVolumeSpecName "kube-api-access-trvfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.273656 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-rz69b"] Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.279674 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-rz69b"] Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.348360 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.348394 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.089146 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" path="/var/lib/kubelet/pods/7a396c68-b7c1-4eda-abc2-563cdd15fee3/volumes" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.162588 4883 scope.go:117] "RemoveContainer" containerID="3d6e4b449f9ccbd94f8a72789781de51d066c6a7bae2debb32621e755120d6e1" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.162733 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.614669 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-f8p8k"] Mar 10 09:56:12 crc kubenswrapper[4883]: E0310 09:56:12.615829 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c430570-1b7a-4e38-9a8b-f13d69c18882" containerName="oc" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.615851 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c430570-1b7a-4e38-9a8b-f13d69c18882" containerName="oc" Mar 10 09:56:12 crc kubenswrapper[4883]: E0310 09:56:12.615894 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" containerName="container-00" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.615901 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" containerName="container-00" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.616133 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c430570-1b7a-4e38-9a8b-f13d69c18882" containerName="oc" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.616159 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" containerName="container-00" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.617016 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.618888 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9vgtk"/"default-dockercfg-7b2hr" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.674026 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.674113 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.776657 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.776868 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.777023 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.793524 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.935726 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: W0310 09:56:12.963262 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8ba7892_8a88_47fc_8a18_2be6dfb4b6ff.slice/crio-56df39cb2589ae984de8d53617c01a58a1c40539c5707b2a5d02a8a6270c572d WatchSource:0}: Error finding container 56df39cb2589ae984de8d53617c01a58a1c40539c5707b2a5d02a8a6270c572d: Status 404 returned error can't find the container with id 56df39cb2589ae984de8d53617c01a58a1c40539c5707b2a5d02a8a6270c572d Mar 10 09:56:13 crc kubenswrapper[4883]: I0310 09:56:13.178498 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" event={"ID":"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff","Type":"ContainerStarted","Data":"532b2527d72aafd115f99ee9c5bf109ede9886f8114ce58f7fee9187e356c305"} Mar 10 09:56:13 crc kubenswrapper[4883]: I0310 09:56:13.178861 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" event={"ID":"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff","Type":"ContainerStarted","Data":"56df39cb2589ae984de8d53617c01a58a1c40539c5707b2a5d02a8a6270c572d"} Mar 10 09:56:13 crc kubenswrapper[4883]: I0310 
09:56:13.674719 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-f8p8k"] Mar 10 09:56:13 crc kubenswrapper[4883]: I0310 09:56:13.684866 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-f8p8k"] Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.191292 4883 generic.go:334] "Generic (PLEG): container finished" podID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" containerID="532b2527d72aafd115f99ee9c5bf109ede9886f8114ce58f7fee9187e356c305" exitCode=0 Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.270756 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.313056 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") pod \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.313178 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") pod \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.313243 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host" (OuterVolumeSpecName: "host") pod "d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" (UID: "d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.313900 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.319631 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx" (OuterVolumeSpecName: "kube-api-access-tqgnx") pod "d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" (UID: "d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff"). InnerVolumeSpecName "kube-api-access-tqgnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.414528 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.846419 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-x2sbs"] Mar 10 09:56:14 crc kubenswrapper[4883]: E0310 09:56:14.846802 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" containerName="container-00" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.846817 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" containerName="container-00" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.847040 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" containerName="container-00" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.847716 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.026313 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.026401 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.128147 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.128261 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.129065 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc 
kubenswrapper[4883]: I0310 09:56:15.144088 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.161546 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: W0310 09:56:15.185566 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb30bf7c_358d_4f5b_a5e0_efdb0a9bc4b0.slice/crio-c2b98c39dccc737efc6ca2be10fbfc53714a5c1f02982eef175dcd5d09bbb73e WatchSource:0}: Error finding container c2b98c39dccc737efc6ca2be10fbfc53714a5c1f02982eef175dcd5d09bbb73e: Status 404 returned error can't find the container with id c2b98c39dccc737efc6ca2be10fbfc53714a5c1f02982eef175dcd5d09bbb73e Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.198551 4883 scope.go:117] "RemoveContainer" containerID="532b2527d72aafd115f99ee9c5bf109ede9886f8114ce58f7fee9187e356c305" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.198667 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.202402 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" event={"ID":"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0","Type":"ContainerStarted","Data":"c2b98c39dccc737efc6ca2be10fbfc53714a5c1f02982eef175dcd5d09bbb73e"} Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.090798 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" path="/var/lib/kubelet/pods/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff/volumes" Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.214603 4883 generic.go:334] "Generic (PLEG): container finished" podID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" containerID="d219caba54af1ce5a0a3fb8cf87ed0f4b6c5ef17917a57a19ad0eb0e451f6dfd" exitCode=0 Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.214648 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" event={"ID":"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0","Type":"ContainerDied","Data":"d219caba54af1ce5a0a3fb8cf87ed0f4b6c5ef17917a57a19ad0eb0e451f6dfd"} Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.247341 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-x2sbs"] Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.255889 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-x2sbs"] Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.305174 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.475867 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") pod \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.475935 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") pod \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.476042 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host" (OuterVolumeSpecName: "host") pod "db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" (UID: "db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.476991 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.482905 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh" (OuterVolumeSpecName: "kube-api-access-mx8wh") pod "db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" (UID: "db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0"). InnerVolumeSpecName "kube-api-access-mx8wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.579798 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:18 crc kubenswrapper[4883]: I0310 09:56:18.089125 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" path="/var/lib/kubelet/pods/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0/volumes" Mar 10 09:56:18 crc kubenswrapper[4883]: I0310 09:56:18.230230 4883 scope.go:117] "RemoveContainer" containerID="d219caba54af1ce5a0a3fb8cf87ed0f4b6c5ef17917a57a19ad0eb0e451f6dfd" Mar 10 09:56:18 crc kubenswrapper[4883]: I0310 09:56:18.230584 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:19 crc kubenswrapper[4883]: I0310 09:56:19.080274 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:56:19 crc kubenswrapper[4883]: E0310 09:56:19.080613 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:33 crc kubenswrapper[4883]: I0310 09:56:33.080331 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:56:33 crc kubenswrapper[4883]: E0310 09:56:33.081032 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.056705 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b55764b68-l794s_17a46674-c6ec-4128-8285-d71c228d11c8/barbican-api/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.286354 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f87cd7fb6-jz6ch_dce7df3b-5f31-4732-8d27-8e06dc07824d/barbican-keystone-listener/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.369183 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b55764b68-l794s_17a46674-c6ec-4128-8285-d71c228d11c8/barbican-api-log/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.469353 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f87cd7fb6-jz6ch_dce7df3b-5f31-4732-8d27-8e06dc07824d/barbican-keystone-listener-log/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.632016 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d48955d69-pvbn8_221490bc-406a-436f-8705-66106ed6bbe0/barbican-worker-log/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.633895 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d48955d69-pvbn8_221490bc-406a-436f-8705-66106ed6bbe0/barbican-worker/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.781545 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r_de8c98db-31db-4ecd-83f2-c53d4bdd2ddd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.857263 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/ceilometer-central-agent/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.896495 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/ceilometer-notification-agent/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.951203 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/proxy-httpd/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.059099 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/sg-core/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.132959 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2caba5f6-d05e-437e-868c-952e8adf3278/cinder-api/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.149589 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2caba5f6-d05e-437e-868c-952e8adf3278/cinder-api-log/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.309357 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a7bae0a1-9bb8-47ba-a161-764cd7406992/probe/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.319239 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a7bae0a1-9bb8-47ba-a161-764cd7406992/cinder-scheduler/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.442162 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm_07ddb6af-f2c7-46eb-aac4-fe69996caf27/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.584335 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh_269dd9c8-3d75-4892-9f75-c4fe1b9093b8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.638220 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/init/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.837932 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/init/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.858617 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr_2428d4e5-b48e-45ad-9bfb-711c3b1e8471/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.901116 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/dnsmasq-dns/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.051340 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4c354fe8-851f-4cf4-bc13-e06dba0a1cc0/glance-httpd/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.054590 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4c354fe8-851f-4cf4-bc13-e06dba0a1cc0/glance-log/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.236059 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_676458e7-e4a0-4f1a-b200-0ab75faaddb4/glance-httpd/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.255814 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_676458e7-e4a0-4f1a-b200-0ab75faaddb4/glance-log/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.387872 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c96ddb-t7tcm_ef0598ad-c7ea-4645-b553-7d9028397156/horizon/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.529786 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9_7e9f7531-37e1-4284-94ac-cada3d2fc301/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.653454 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c96ddb-t7tcm_ef0598ad-c7ea-4645-b553-7d9028397156/horizon-log/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.716160 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kglh5_361b2613-f26e-45c3-aabe-9a0f115e8e10/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.946668 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_39c373dd-952a-4305-82ed-1d047c7a859f/kube-state-metrics/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.005376 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-744f4576f6-kglt9_c6effa97-6f88-4706-98bc-b51af01bd993/keystone-api/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.173261 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-thhsw_eb3b72a2-945a-4719-87c0-ffaf7eb84b52/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.414065 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b5fb6fc5c-pj985_82fb8a17-1c35-415a-8a5d-478730286eb1/neutron-httpd/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.709002 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b5fb6fc5c-pj985_82fb8a17-1c35-415a-8a5d-478730286eb1/neutron-api/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.797963 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4_d37d0afe-ad64-4616-b877-bd05deefd038/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.321169 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_14ae000e-33d5-4caa-8b61-dd1ab03b9978/nova-api-log/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.430453 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_19096ebe-3796-4e22-a477-45d3e635a80a/nova-cell0-conductor-conductor/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.460391 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_14ae000e-33d5-4caa-8b61-dd1ab03b9978/nova-api-api/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.608975 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_90b06d82-9f07-4c29-9bad-987d2c6d027c/nova-cell1-conductor-conductor/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.719732 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2c5d710c-62fb-4a8c-8a5c-ec6709017c75/nova-cell1-novncproxy-novncproxy/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.814348 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-47dxf_af134b73-8c24-4b9e-b15e-48ff4b83ecd4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.986190 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0743bd84-b1d5-4634-9a7f-2c9daf2a5994/nova-metadata-log/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.189825 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_626b3115-ced1-45ea-8401-e2bd7e79a20c/nova-scheduler-scheduler/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.261851 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/mysql-bootstrap/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.470440 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/galera/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.527286 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/mysql-bootstrap/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.668307 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/mysql-bootstrap/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.870841 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/mysql-bootstrap/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.929711 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/galera/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.960959 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0743bd84-b1d5-4634-9a7f-2c9daf2a5994/nova-metadata-metadata/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.082541 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_166b0c95-d44f-41e4-b27a-01e549dfb9d2/openstackclient/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.165239 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lb2z9_6691939e-adb0-420c-bf9e-f4a9b670c83b/ovn-controller/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.338368 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-b2z2p_570aed6d-03dc-4ad5-b0e1-c6efc4facabb/openstack-network-exporter/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.435534 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server-init/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.584035 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server-init/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.633315 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovs-vswitchd/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.663826 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.795413 4883 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7cqkz_bbcde384-73a5-48c3-a5fb-226d671707cb/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.854602 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b47099e9-f945-4873-a704-ee55b0f0ac46/openstack-network-exporter/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.883586 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b47099e9-f945-4873-a704-ee55b0f0ac46/ovn-northd/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.289987 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_be383ddb-b33d-4129-acf8-1ffbbc21b1d4/openstack-network-exporter/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.293113 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_be383ddb-b33d-4129-acf8-1ffbbc21b1d4/ovsdbserver-nb/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.389236 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249a9bf5-ef0f-4209-855e-3fa422106519/openstack-network-exporter/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.480645 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249a9bf5-ef0f-4209-855e-3fa422106519/ovsdbserver-sb/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.529452 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5946656968-5mzlm_309c3af5-db30-48b8-8118-471950b7312c/placement-api/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.687948 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5946656968-5mzlm_309c3af5-db30-48b8-8118-471950b7312c/placement-log/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.728385 
4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/setup-container/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.022468 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/rabbitmq/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.057703 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/setup-container/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.064658 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/setup-container/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.221700 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/setup-container/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.271506 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/rabbitmq/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.277289 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz_0efdf39d-2133-4aaf-9fec-2b50533d3cae/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.478859 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pf4n9_d3461a81-abbe-4c3e-88ca-42eff1eeb14e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.530663 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7_4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.690606 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rlqjc_61bb4cc5-1d4f-4439-a00e-4b2e27d4802b/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.743806 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-v5v84_caa69332-97ab-4629-900f-1596af363ba4/ssh-known-hosts-edpm-deployment/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.965925 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6656f7cc-nv5pp_ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b/proxy-server/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.993825 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6656f7cc-nv5pp_ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b/proxy-httpd/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.109755 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-n4vhh_cbe93226-96c7-4854-abdc-4afe54ad7ad5/swift-ring-rebalance/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.201787 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-auditor/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.281275 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-reaper/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.348268 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-replicator/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.380489 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-server/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.431084 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-auditor/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.538895 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-updater/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.544018 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-server/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.565451 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-replicator/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.703466 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-auditor/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.749351 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-expirer/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.756137 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-replicator/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.776319 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-server/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.892178 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-updater/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.948584 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/rsync/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.009233 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/swift-recon-cron/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.171539 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-blk56_b083d3b3-edb7-4d2f-a7b7-f1275bd83fde/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.201394 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d483d791-15b3-49e7-8095-5660a9d0fdaa/tempest-tests-tempest-tests-runner/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.399948 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4d76dec9-afd2-4850-aacb-c8d60819fc1e/test-operator-logs-container/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.465176 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp_20e06399-dd26-4a60-a6b7-261cc4505a92/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:46 crc kubenswrapper[4883]: I0310 09:56:46.080261 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 
09:56:46 crc kubenswrapper[4883]: E0310 09:56:46.080697 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:53 crc kubenswrapper[4883]: I0310 09:56:53.274820 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_52bdcacc-ce19-418b-871c-35482038da29/memcached/0.log" Mar 10 09:57:01 crc kubenswrapper[4883]: I0310 09:57:01.080639 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:01 crc kubenswrapper[4883]: E0310 09:57:01.082860 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:57:07 crc kubenswrapper[4883]: I0310 09:57:07.875368 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.070097 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.075403 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.084575 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.213751 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.231913 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/extract/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.233727 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.575739 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-h2cxw_9a394c48-31ca-4e99-b210-45ae6f67faaa/manager/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.851228 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-w9dbp_63474f68-d09d-4822-b650-96a37aead592/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 09:57:09.058347 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-mbxnn_bf027c79-6bdb-4cfb-8c31-d785b80e2231/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 
09:57:09.298007 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fvwbt_8a4cb5eb-0894-440e-8cfd-448651696a6f/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 09:57:09.674180 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-nzdsk_09a04267-a914-4c55-add8-735a053038d3/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 09:57:09.735817 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-txdwh_884f7bcb-08ef-49f3-912b-ca921e342615/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 09:57:09.891302 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-v6p2d_c994e4ad-140c-4655-ad69-e4013406d12e/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.029036 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-v5kxw_ad93994a-26d2-4353-80be-456c1311020e/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.120349 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-dgrlb_8b177c77-d85f-4374-b6db-a700719c1282/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.386881 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-kz9sv_ec624ec4-966f-410c-95c7-73be0f9cad27/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.543132 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-snvh5_91415f40-08a2-451b-abe8-38c7b447e66f/manager/0.log" Mar 10 09:57:10 crc 
kubenswrapper[4883]: I0310 09:57:10.677636 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-rpwdx_760c8dff-c64a-492b-a778-45ef16d197bd/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.785610 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-49gjk_d0e08342-2d1b-42d9-921e-1d948f701a58/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.990899 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6647d7885f9f2px_2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f/manager/0.log" Mar 10 09:57:11 crc kubenswrapper[4883]: I0310 09:57:11.335716 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6cf8df7788-tzrb8_31e7ec33-4b44-48ce-9f01-e483a7668dd6/operator/0.log" Mar 10 09:57:11 crc kubenswrapper[4883]: I0310 09:57:11.452745 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-c4vjl_83852eec-509b-4074-b837-4f00d1d07d05/registry-server/0.log" Mar 10 09:57:11 crc kubenswrapper[4883]: I0310 09:57:11.573750 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-qnwgj_c13f33e2-dd6a-4ca0-91e7-5489c753e273/manager/0.log" Mar 10 09:57:11 crc kubenswrapper[4883]: I0310 09:57:11.912804 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-pppd9_04b3aecb-7cfd-4042-b003-4bc8c339aff8/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.030753 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pjjsn_475c1190-6d94-431a-943d-4e749ea87d6b/operator/0.log" Mar 10 
09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.180454 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-m6wph_1b429bd6-00de-4cc2-8a18-9f58897b6834/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.358902 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-mkjnt_3f4c2998-b51a-4620-b674-60bb0817eb7d/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.442781 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-8mpp4_d3d3c04d-7e05-4df2-85c6-394d0bde1a69/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.683820 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-rkjsw_a7216675-a296-4faa-9dd5-d857b15ffa3c/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.755804 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6679ddfdc7-9ntl4_969b2d39-fb99-42df-8e6e-3ded5cd292c8/manager/0.log" Mar 10 09:57:14 crc kubenswrapper[4883]: I0310 09:57:14.307311 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-q52nj_ac18771f-5f45-40d8-b275-38e2e1c48ba6/manager/0.log" Mar 10 09:57:15 crc kubenswrapper[4883]: I0310 09:57:15.080375 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:15 crc kubenswrapper[4883]: E0310 09:57:15.080703 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.369924 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:23 crc kubenswrapper[4883]: E0310 09:57:23.370784 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" containerName="container-00" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.370800 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" containerName="container-00" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.371014 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" containerName="container-00" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.372309 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.379125 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.517070 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.517212 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.517496 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.619679 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.619767 4883 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.619818 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.620151 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.620535 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.637160 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.693172 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:24 crc kubenswrapper[4883]: I0310 09:57:24.138334 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:24 crc kubenswrapper[4883]: I0310 09:57:24.816759 4883 generic.go:334] "Generic (PLEG): container finished" podID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerID="1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a" exitCode=0 Mar 10 09:57:24 crc kubenswrapper[4883]: I0310 09:57:24.816924 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerDied","Data":"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a"} Mar 10 09:57:24 crc kubenswrapper[4883]: I0310 09:57:24.817192 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerStarted","Data":"eef520defbaf457ad9a4386db296afc18afcda0e6337af57c3e015771d645861"} Mar 10 09:57:25 crc kubenswrapper[4883]: I0310 09:57:25.829147 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerStarted","Data":"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f"} Mar 10 09:57:26 crc kubenswrapper[4883]: I0310 09:57:26.837846 4883 generic.go:334] "Generic (PLEG): container finished" podID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerID="69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f" exitCode=0 Mar 10 09:57:26 crc kubenswrapper[4883]: I0310 09:57:26.838022 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" 
event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerDied","Data":"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f"} Mar 10 09:57:27 crc kubenswrapper[4883]: I0310 09:57:27.852244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerStarted","Data":"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b"} Mar 10 09:57:27 crc kubenswrapper[4883]: I0310 09:57:27.871121 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-49cnv" podStartSLOduration=2.356227816 podStartE2EDuration="4.871103212s" podCreationTimestamp="2026-03-10 09:57:23 +0000 UTC" firstStartedPulling="2026-03-10 09:57:24.818812936 +0000 UTC m=+3231.073710825" lastFinishedPulling="2026-03-10 09:57:27.333688331 +0000 UTC m=+3233.588586221" observedRunningTime="2026-03-10 09:57:27.868138354 +0000 UTC m=+3234.123036242" watchObservedRunningTime="2026-03-10 09:57:27.871103212 +0000 UTC m=+3234.126001101" Mar 10 09:57:28 crc kubenswrapper[4883]: I0310 09:57:28.080342 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:28 crc kubenswrapper[4883]: E0310 09:57:28.080659 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:57:29 crc kubenswrapper[4883]: I0310 09:57:29.856154 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dlh8b_7ec510e9-f96b-44da-abec-7d49115d0c83/control-plane-machine-set-operator/0.log" Mar 10 09:57:29 crc kubenswrapper[4883]: I0310 09:57:29.990137 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7clc9_3de74a75-4aa1-46dd-ae5b-5c82b91811e5/kube-rbac-proxy/0.log" Mar 10 09:57:30 crc kubenswrapper[4883]: I0310 09:57:30.020679 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7clc9_3de74a75-4aa1-46dd-ae5b-5c82b91811e5/machine-api-operator/0.log" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.693668 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.695199 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.737592 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.937831 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.983720 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:35 crc kubenswrapper[4883]: I0310 09:57:35.917702 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-49cnv" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="registry-server" containerID="cri-o://0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" gracePeriod=2 Mar 10 09:57:36 crc 
kubenswrapper[4883]: I0310 09:57:36.339460 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.390013 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") pod \"1aa44e40-18b7-44d8-9359-0b11eaa53417\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.390251 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") pod \"1aa44e40-18b7-44d8-9359-0b11eaa53417\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.390294 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") pod \"1aa44e40-18b7-44d8-9359-0b11eaa53417\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.390834 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities" (OuterVolumeSpecName: "utilities") pod "1aa44e40-18b7-44d8-9359-0b11eaa53417" (UID: "1aa44e40-18b7-44d8-9359-0b11eaa53417"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.391092 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.395378 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms" (OuterVolumeSpecName: "kube-api-access-jnwms") pod "1aa44e40-18b7-44d8-9359-0b11eaa53417" (UID: "1aa44e40-18b7-44d8-9359-0b11eaa53417"). InnerVolumeSpecName "kube-api-access-jnwms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.407147 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1aa44e40-18b7-44d8-9359-0b11eaa53417" (UID: "1aa44e40-18b7-44d8-9359-0b11eaa53417"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.493000 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.493034 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") on node \"crc\" DevicePath \"\"" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929459 4883 generic.go:334] "Generic (PLEG): container finished" podID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerID="0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" exitCode=0 Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929532 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerDied","Data":"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b"} Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929584 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929611 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerDied","Data":"eef520defbaf457ad9a4386db296afc18afcda0e6337af57c3e015771d645861"} Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929639 4883 scope.go:117] "RemoveContainer" containerID="0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.951125 4883 scope.go:117] "RemoveContainer" containerID="69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.962635 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.968776 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.993424 4883 scope.go:117] "RemoveContainer" containerID="1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.015907 4883 scope.go:117] "RemoveContainer" containerID="0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" Mar 10 09:57:37 crc kubenswrapper[4883]: E0310 09:57:37.016360 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b\": container with ID starting with 0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b not found: ID does not exist" containerID="0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.016412 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b"} err="failed to get container status \"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b\": rpc error: code = NotFound desc = could not find container \"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b\": container with ID starting with 0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b not found: ID does not exist" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.016460 4883 scope.go:117] "RemoveContainer" containerID="69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f" Mar 10 09:57:37 crc kubenswrapper[4883]: E0310 09:57:37.016848 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f\": container with ID starting with 69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f not found: ID does not exist" containerID="69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.016885 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f"} err="failed to get container status \"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f\": rpc error: code = NotFound desc = could not find container \"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f\": container with ID starting with 69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f not found: ID does not exist" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.016917 4883 scope.go:117] "RemoveContainer" containerID="1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a" Mar 10 09:57:37 crc kubenswrapper[4883]: E0310 
09:57:37.017339 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a\": container with ID starting with 1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a not found: ID does not exist" containerID="1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.017360 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a"} err="failed to get container status \"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a\": rpc error: code = NotFound desc = could not find container \"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a\": container with ID starting with 1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a not found: ID does not exist" Mar 10 09:57:38 crc kubenswrapper[4883]: I0310 09:57:38.093657 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" path="/var/lib/kubelet/pods/1aa44e40-18b7-44d8-9359-0b11eaa53417/volumes" Mar 10 09:57:39 crc kubenswrapper[4883]: I0310 09:57:39.079944 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:39 crc kubenswrapper[4883]: E0310 09:57:39.080243 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:57:41 crc kubenswrapper[4883]: I0310 09:57:41.190955 
4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-kl2rd_1c0c9250-e9df-4898-bd0e-91919353a3f6/cert-manager-controller/0.log" Mar 10 09:57:41 crc kubenswrapper[4883]: I0310 09:57:41.317734 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n2g9x_b92cb5d0-214a-49a6-b9b7-f210fef36956/cert-manager-cainjector/0.log" Mar 10 09:57:41 crc kubenswrapper[4883]: I0310 09:57:41.348747 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-dfhh4_f33cf1b9-ce0d-41f4-8f36-1b159badc41e/cert-manager-webhook/0.log" Mar 10 09:57:51 crc kubenswrapper[4883]: I0310 09:57:51.770593 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mr8tf_805fc4e3-bab7-415e-a190-0ceeda5bd8b7/nmstate-console-plugin/0.log" Mar 10 09:57:51 crc kubenswrapper[4883]: I0310 09:57:51.938847 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5lcxd_d9c7e9ee-a0a0-4afe-bd00-872553ca9b32/nmstate-handler/0.log" Mar 10 09:57:51 crc kubenswrapper[4883]: I0310 09:57:51.991056 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-x5lcq_291985dd-d623-46ba-9e1b-056dc17d26ed/kube-rbac-proxy/0.log" Mar 10 09:57:52 crc kubenswrapper[4883]: I0310 09:57:52.029244 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-x5lcq_291985dd-d623-46ba-9e1b-056dc17d26ed/nmstate-metrics/0.log" Mar 10 09:57:52 crc kubenswrapper[4883]: I0310 09:57:52.080593 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:52 crc kubenswrapper[4883]: I0310 09:57:52.144791 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-k4v4s_a776287a-5b99-4f43-8d4c-191108392859/nmstate-operator/0.log" Mar 10 09:57:52 crc kubenswrapper[4883]: I0310 09:57:52.232845 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-ccbds_10ab1e00-47a1-4f9a-a55a-131935759d8d/nmstate-webhook/0.log" Mar 10 09:57:53 crc kubenswrapper[4883]: I0310 09:57:53.078133 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3"} Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.164414 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 09:58:00 crc kubenswrapper[4883]: E0310 09:58:00.165555 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="extract-content" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.165574 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="extract-content" Mar 10 09:58:00 crc kubenswrapper[4883]: E0310 09:58:00.165592 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="extract-utilities" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.165599 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="extract-utilities" Mar 10 09:58:00 crc kubenswrapper[4883]: E0310 09:58:00.165622 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="registry-server" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.165630 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="registry-server" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.165867 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="registry-server" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.166589 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.168392 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.169081 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.172658 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.184127 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.279221 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") pod \"auto-csr-approver-29552278-mmnzb\" (UID: \"63837d10-3c84-4972-98da-7415e14f2594\") " pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.381954 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") pod \"auto-csr-approver-29552278-mmnzb\" (UID: \"63837d10-3c84-4972-98da-7415e14f2594\") " 
pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.401280 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") pod \"auto-csr-approver-29552278-mmnzb\" (UID: \"63837d10-3c84-4972-98da-7415e14f2594\") " pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.484319 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.920023 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.924562 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:58:01 crc kubenswrapper[4883]: I0310 09:58:01.143538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" event={"ID":"63837d10-3c84-4972-98da-7415e14f2594","Type":"ContainerStarted","Data":"ef31c9fb65c55e053a9af5e0ebcb8a4603e140babca44c4fa31676ab6ca88816"} Mar 10 09:58:03 crc kubenswrapper[4883]: I0310 09:58:03.166959 4883 generic.go:334] "Generic (PLEG): container finished" podID="63837d10-3c84-4972-98da-7415e14f2594" containerID="d248878325804477b2b46157dab9cab4990cb786e7cd390c24f00599d57f6825" exitCode=0 Mar 10 09:58:03 crc kubenswrapper[4883]: I0310 09:58:03.167042 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" event={"ID":"63837d10-3c84-4972-98da-7415e14f2594","Type":"ContainerDied","Data":"d248878325804477b2b46157dab9cab4990cb786e7cd390c24f00599d57f6825"} Mar 10 09:58:04 crc kubenswrapper[4883]: I0310 09:58:04.459851 4883 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:04 crc kubenswrapper[4883]: I0310 09:58:04.575644 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") pod \"63837d10-3c84-4972-98da-7415e14f2594\" (UID: \"63837d10-3c84-4972-98da-7415e14f2594\") " Mar 10 09:58:04 crc kubenswrapper[4883]: I0310 09:58:04.592982 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm" (OuterVolumeSpecName: "kube-api-access-2j5dm") pod "63837d10-3c84-4972-98da-7415e14f2594" (UID: "63837d10-3c84-4972-98da-7415e14f2594"). InnerVolumeSpecName "kube-api-access-2j5dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:58:04 crc kubenswrapper[4883]: I0310 09:58:04.677869 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.185432 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" event={"ID":"63837d10-3c84-4972-98da-7415e14f2594","Type":"ContainerDied","Data":"ef31c9fb65c55e053a9af5e0ebcb8a4603e140babca44c4fa31676ab6ca88816"} Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.185512 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef31c9fb65c55e053a9af5e0ebcb8a4603e140babca44c4fa31676ab6ca88816" Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.185519 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.532218 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"] Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.538668 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"] Mar 10 09:58:06 crc kubenswrapper[4883]: I0310 09:58:06.088595 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd9f896-b725-4a44-825a-9fd728da26b2" path="/var/lib/kubelet/pods/4fd9f896-b725-4a44-825a-9fd728da26b2/volumes" Mar 10 09:58:06 crc kubenswrapper[4883]: I0310 09:58:06.885894 4883 scope.go:117] "RemoveContainer" containerID="55da525eb21d992e868ae1c36ae9269aafbac403e9bea6b7d9b244d9b58e489c" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.010258 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtrbh_59437559-8b42-4779-8b72-17f09b50b572/kube-rbac-proxy/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.131230 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtrbh_59437559-8b42-4779-8b72-17f09b50b572/controller/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.251964 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.416370 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.429131 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 
09:58:15.443088 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.456992 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.628181 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.629129 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.631081 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.664438 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.804521 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.808496 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.815619 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.819570 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/controller/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.950753 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/frr-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.966197 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/kube-rbac-proxy/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.027761 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/kube-rbac-proxy-frr/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.239233 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-shjnr_8e843a56-715a-44fc-9974-8570d49bd9a0/frr-k8s-webhook-server/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.250093 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/reloader/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.459417 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c79cc77cd-s6vgn_5804aa0d-ee19-4fb3-bd39-27c7103571d8/manager/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.644203 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57848ff665-prp4d_cb05036e-52f2-48ab-ba84-f89c4565a0af/webhook-server/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.746615 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gtqfn_6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf/kube-rbac-proxy/0.log" Mar 10 09:58:17 crc kubenswrapper[4883]: I0310 09:58:17.229044 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gtqfn_6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf/speaker/0.log" Mar 10 09:58:17 crc kubenswrapper[4883]: I0310 09:58:17.377847 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/frr/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.118740 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.261998 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.264410 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.296467 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.414780 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.430934 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.436405 4883 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/extract/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.595822 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.752210 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.756576 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.769683 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.896975 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.932408 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.045571 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.300712 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/registry-server/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.314321 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.317306 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.357647 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.512834 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.528408 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.713389 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.906654 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.915196 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.944922 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.960783 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/registry-server/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.128236 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/extract/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.133002 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.145297 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.273007 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d6jf_849aec1a-3ce6-4153-8e52-4bf0185e29e3/marketplace-operator/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.341087 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.468605 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.468692 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.491308 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.667366 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.674149 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.791164 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/registry-server/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.879985 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.982318 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.992975 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 09:58:31 crc kubenswrapper[4883]: I0310 09:58:31.012038 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 09:58:31 crc kubenswrapper[4883]: I0310 09:58:31.180430 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 09:58:31 crc kubenswrapper[4883]: I0310 09:58:31.188102 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 09:58:31 crc kubenswrapper[4883]: I0310 09:58:31.596331 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/registry-server/0.log" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.129803 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qmm8n"] Mar 10 09:59:23 crc kubenswrapper[4883]: E0310 09:59:23.131160 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63837d10-3c84-4972-98da-7415e14f2594" containerName="oc" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.131179 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="63837d10-3c84-4972-98da-7415e14f2594" containerName="oc" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.133210 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="63837d10-3c84-4972-98da-7415e14f2594" containerName="oc" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.135150 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.143866 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmm8n"] Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.300452 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.300529 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.300586 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.401714 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.401757 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.401802 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.402278 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.402341 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.421378 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.469201 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.995975 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmm8n"] Mar 10 09:59:24 crc kubenswrapper[4883]: I0310 09:59:24.902855 4883 generic.go:334] "Generic (PLEG): container finished" podID="a1ddb85f-2071-49f6-a977-999227732efc" containerID="95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174" exitCode=0 Mar 10 09:59:24 crc kubenswrapper[4883]: I0310 09:59:24.902907 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerDied","Data":"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"} Mar 10 09:59:24 crc kubenswrapper[4883]: I0310 09:59:24.903440 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerStarted","Data":"8c092cbe46ef3bb250fea53ef43ad5c32d25e6d4568f6eb7244f4a5cff9be4ae"} Mar 10 09:59:25 crc kubenswrapper[4883]: I0310 09:59:25.912511 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerStarted","Data":"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"} Mar 10 09:59:26 crc kubenswrapper[4883]: I0310 09:59:26.941170 4883 generic.go:334] "Generic (PLEG): container finished" podID="a1ddb85f-2071-49f6-a977-999227732efc" containerID="2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154" exitCode=0 Mar 10 09:59:26 crc kubenswrapper[4883]: I0310 09:59:26.941218 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" 
event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerDied","Data":"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"} Mar 10 09:59:27 crc kubenswrapper[4883]: I0310 09:59:27.952971 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerStarted","Data":"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"} Mar 10 09:59:27 crc kubenswrapper[4883]: I0310 09:59:27.970895 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qmm8n" podStartSLOduration=2.376253646 podStartE2EDuration="4.970877645s" podCreationTimestamp="2026-03-10 09:59:23 +0000 UTC" firstStartedPulling="2026-03-10 09:59:24.90470951 +0000 UTC m=+3351.159607400" lastFinishedPulling="2026-03-10 09:59:27.49933351 +0000 UTC m=+3353.754231399" observedRunningTime="2026-03-10 09:59:27.965835211 +0000 UTC m=+3354.220733099" watchObservedRunningTime="2026-03-10 09:59:27.970877645 +0000 UTC m=+3354.225775534" Mar 10 09:59:33 crc kubenswrapper[4883]: I0310 09:59:33.469790 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:33 crc kubenswrapper[4883]: I0310 09:59:33.471253 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:33 crc kubenswrapper[4883]: I0310 09:59:33.520115 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:34 crc kubenswrapper[4883]: I0310 09:59:34.042548 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:34 crc kubenswrapper[4883]: I0310 09:59:34.091046 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-qmm8n"] Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.023064 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qmm8n" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="registry-server" containerID="cri-o://44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9" gracePeriod=2 Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.415090 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.563923 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") pod \"a1ddb85f-2071-49f6-a977-999227732efc\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.564386 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") pod \"a1ddb85f-2071-49f6-a977-999227732efc\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.564440 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") pod \"a1ddb85f-2071-49f6-a977-999227732efc\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.565959 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities" (OuterVolumeSpecName: "utilities") pod "a1ddb85f-2071-49f6-a977-999227732efc" (UID: 
"a1ddb85f-2071-49f6-a977-999227732efc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.569115 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4" (OuterVolumeSpecName: "kube-api-access-4wtc4") pod "a1ddb85f-2071-49f6-a977-999227732efc" (UID: "a1ddb85f-2071-49f6-a977-999227732efc"). InnerVolumeSpecName "kube-api-access-4wtc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.615757 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1ddb85f-2071-49f6-a977-999227732efc" (UID: "a1ddb85f-2071-49f6-a977-999227732efc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.666859 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.666893 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.666903 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.032938 4883 generic.go:334] "Generic (PLEG): container finished"
podID="a1ddb85f-2071-49f6-a977-999227732efc" containerID="44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9" exitCode=0
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.032989 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerDied","Data":"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"}
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.033019 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerDied","Data":"8c092cbe46ef3bb250fea53ef43ad5c32d25e6d4568f6eb7244f4a5cff9be4ae"}
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.033036 4883 scope.go:117] "RemoveContainer" containerID="44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.033170 4883 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-qmm8n"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.060966 4883 scope.go:117] "RemoveContainer" containerID="2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.065158 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmm8n"]
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.075492 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qmm8n"]
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.091980 4883 scope.go:117] "RemoveContainer" containerID="95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.119417 4883 scope.go:117] "RemoveContainer" containerID="44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"
Mar 10 09:59:37 crc kubenswrapper[4883]: E0310 09:59:37.119895 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9\": container with ID starting with 44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9 not found: ID does not exist" containerID="44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.119945 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"} err="failed to get container status \"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9\": rpc error: code = NotFound desc = could not find container \"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9\": container with ID starting with 44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9 not
found: ID does not exist"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.119979 4883 scope.go:117] "RemoveContainer" containerID="2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"
Mar 10 09:59:37 crc kubenswrapper[4883]: E0310 09:59:37.120511 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154\": container with ID starting with 2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154 not found: ID does not exist" containerID="2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.120587 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"} err="failed to get container status \"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154\": rpc error: code = NotFound desc = could not find container \"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154\": container with ID starting with 2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154 not found: ID does not exist"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.120638 4883 scope.go:117] "RemoveContainer" containerID="95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"
Mar 10 09:59:37 crc kubenswrapper[4883]: E0310 09:59:37.121073 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174\": container with ID starting with 95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174 not found: ID does not exist" containerID="95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.121110 4883
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"} err="failed to get container status \"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174\": rpc error: code = NotFound desc = could not find container \"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174\": container with ID starting with 95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174 not found: ID does not exist"
Mar 10 09:59:38 crc kubenswrapper[4883]: I0310 09:59:38.093298 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ddb85f-2071-49f6-a977-999227732efc" path="/var/lib/kubelet/pods/a1ddb85f-2071-49f6-a977-999227732efc/volumes"
Mar 10 09:59:59 crc kubenswrapper[4883]: I0310 09:59:59.231082 4883 generic.go:334] "Generic (PLEG): container finished" podID="e940d297-b038-48e9-a4bd-777df629de28" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6" exitCode=0
Mar 10 09:59:59 crc kubenswrapper[4883]: I0310 09:59:59.231148 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" event={"ID":"e940d297-b038-48e9-a4bd-777df629de28","Type":"ContainerDied","Data":"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"}
Mar 10 09:59:59 crc kubenswrapper[4883]: I0310 09:59:59.232590 4883 scope.go:117] "RemoveContainer" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"
Mar 10 09:59:59 crc kubenswrapper[4883]: I0310 09:59:59.334804 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9vgtk_must-gather-bkkgz_e940d297-b038-48e9-a4bd-777df629de28/gather/0.log"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.142512 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"]
Mar 10 10:00:00 crc kubenswrapper[4883]: E0310 10:00:00.143448 4883
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="extract-utilities"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.143572 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="extract-utilities"
Mar 10 10:00:00 crc kubenswrapper[4883]: E0310 10:00:00.143654 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="extract-content"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.143709 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="extract-content"
Mar 10 10:00:00 crc kubenswrapper[4883]: E0310 10:00:00.143799 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="registry-server"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.143847 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="registry-server"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.144118 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="registry-server"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.144892 4883 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.146927 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.147220 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.147309 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.152200 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"]
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.153201 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.154505 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.154712 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.162334 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"]
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.173418 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"]
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.217553 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j2qf\" (UniqueName:
\"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.217588 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") pod \"auto-csr-approver-29552280-8d8wl\" (UID: \"f7b2dc78-bc43-4cf8-a946-509772bb2522\") " pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.217712 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.217736 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.318973 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") pod \"auto-csr-approver-29552280-8d8wl\" (UID: \"f7b2dc78-bc43-4cf8-a946-509772bb2522\") " pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00
crc kubenswrapper[4883]: I0310 10:00:00.319122 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.319169 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.319273 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j2qf\" (UniqueName: \"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.320254 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.339538 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") pod \"collect-profiles-29552280-glfq4\" (UID:
\"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.342039 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") pod \"auto-csr-approver-29552280-8d8wl\" (UID: \"f7b2dc78-bc43-4cf8-a946-509772bb2522\") " pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.342936 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j2qf\" (UniqueName: \"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.469884 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.480323 4883 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.940542 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"]
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.948866 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"]
Mar 10 10:00:01 crc kubenswrapper[4883]: I0310 10:00:01.250046 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4" event={"ID":"fa19fde8-da7f-4160-8ac1-79860fb75e66","Type":"ContainerStarted","Data":"d68b6023f7937a4e68bf3efce21061e98f0d818f1ee1d42d9743a6027a1ab521"}
Mar 10 10:00:01 crc kubenswrapper[4883]: I0310 10:00:01.250117 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4" event={"ID":"fa19fde8-da7f-4160-8ac1-79860fb75e66","Type":"ContainerStarted","Data":"f09ebad2386e0c657de76dfbc8e429c470df668ddc5e2663f504ce9691501d95"}
Mar 10 10:00:01 crc kubenswrapper[4883]: I0310 10:00:01.251396 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" event={"ID":"f7b2dc78-bc43-4cf8-a946-509772bb2522","Type":"ContainerStarted","Data":"fb8b19a18b53bfac4eac10e0cd5189c015cb4d934110a7e3faf39d5f432028c6"}
Mar 10 10:00:02 crc kubenswrapper[4883]: I0310 10:00:02.262437 4883 generic.go:334] "Generic (PLEG): container finished" podID="fa19fde8-da7f-4160-8ac1-79860fb75e66" containerID="d68b6023f7937a4e68bf3efce21061e98f0d818f1ee1d42d9743a6027a1ab521" exitCode=0
Mar 10 10:00:02 crc kubenswrapper[4883]: I0310 10:00:02.262532 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
event={"ID":"fa19fde8-da7f-4160-8ac1-79860fb75e66","Type":"ContainerDied","Data":"d68b6023f7937a4e68bf3efce21061e98f0d818f1ee1d42d9743a6027a1ab521"}
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.578014 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.692438 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j2qf\" (UniqueName: \"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") pod \"fa19fde8-da7f-4160-8ac1-79860fb75e66\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") "
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.692522 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") pod \"fa19fde8-da7f-4160-8ac1-79860fb75e66\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") "
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.692626 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") pod \"fa19fde8-da7f-4160-8ac1-79860fb75e66\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") "
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.693510 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa19fde8-da7f-4160-8ac1-79860fb75e66" (UID: "fa19fde8-da7f-4160-8ac1-79860fb75e66"). InnerVolumeSpecName "config-volume".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.698768 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa19fde8-da7f-4160-8ac1-79860fb75e66" (UID: "fa19fde8-da7f-4160-8ac1-79860fb75e66"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.698983 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf" (OuterVolumeSpecName: "kube-api-access-4j2qf") pod "fa19fde8-da7f-4160-8ac1-79860fb75e66" (UID: "fa19fde8-da7f-4160-8ac1-79860fb75e66"). InnerVolumeSpecName "kube-api-access-4j2qf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.794531 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j2qf\" (UniqueName: \"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.794554 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.794564 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.282987 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
event={"ID":"fa19fde8-da7f-4160-8ac1-79860fb75e66","Type":"ContainerDied","Data":"f09ebad2386e0c657de76dfbc8e429c470df668ddc5e2663f504ce9691501d95"}
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.283033 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09ebad2386e0c657de76dfbc8e429c470df668ddc5e2663f504ce9691501d95"
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.283049 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.342159 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.348096 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.092364 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" path="/var/lib/kubelet/pods/e16ab2a6-c8ca-4487-b42f-381f61d18ba0/volumes"
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.287551 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"]
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.287848 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="copy" containerID="cri-o://765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615" gracePeriod=2
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.297336 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"]
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.683614 4883
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9vgtk_must-gather-bkkgz_e940d297-b038-48e9-a4bd-777df629de28/copy/0.log"
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.684081 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/must-gather-bkkgz"
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.855413 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") pod \"e940d297-b038-48e9-a4bd-777df629de28\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") "
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.855747 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") pod \"e940d297-b038-48e9-a4bd-777df629de28\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") "
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.860368 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952" (OuterVolumeSpecName: "kube-api-access-gm952") pod "e940d297-b038-48e9-a4bd-777df629de28" (UID: "e940d297-b038-48e9-a4bd-777df629de28"). InnerVolumeSpecName "kube-api-access-gm952".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.958659 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.982706 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e940d297-b038-48e9-a4bd-777df629de28" (UID: "e940d297-b038-48e9-a4bd-777df629de28"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.043315 4883 scope.go:117] "RemoveContainer" containerID="1cb9093a5dc1551f7fb85ef25abe36d1ab423453387c5dcc49644004e7492e56"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.061564 4883 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.314539 4883 generic.go:334] "Generic (PLEG): container finished" podID="e940d297-b038-48e9-a4bd-777df629de28" containerID="765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615" exitCode=143
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.314650 4883 scope.go:117] "RemoveContainer" containerID="765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.314643 4883 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-9vgtk/must-gather-bkkgz"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.339628 4883 scope.go:117] "RemoveContainer" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.409697 4883 scope.go:117] "RemoveContainer" containerID="765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"
Mar 10 10:00:07 crc kubenswrapper[4883]: E0310 10:00:07.410145 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615\": container with ID starting with 765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615 not found: ID does not exist" containerID="765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.410178 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"} err="failed to get container status \"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615\": rpc error: code = NotFound desc = could not find container \"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615\": container with ID starting with 765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615 not found: ID does not exist"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.410201 4883 scope.go:117] "RemoveContainer" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"
Mar 10 10:00:07 crc kubenswrapper[4883]: E0310 10:00:07.410606 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6\": container with ID starting with
5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6 not found: ID does not exist" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.410652 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"} err="failed to get container status \"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6\": rpc error: code = NotFound desc = could not find container \"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6\": container with ID starting with 5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6 not found: ID does not exist"
Mar 10 10:00:08 crc kubenswrapper[4883]: I0310 10:00:08.095972 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e940d297-b038-48e9-a4bd-777df629de28" path="/var/lib/kubelet/pods/e940d297-b038-48e9-a4bd-777df629de28/volumes"
Mar 10 10:00:15 crc kubenswrapper[4883]: I0310 10:00:15.407668 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" event={"ID":"f7b2dc78-bc43-4cf8-a946-509772bb2522","Type":"ContainerStarted","Data":"b6eef7765d3e9379f71ea8bde62b6fb0aea46e71cd54c2a9f6788a8b9c14680a"}
Mar 10 10:00:15 crc kubenswrapper[4883]: I0310 10:00:15.431281 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" podStartSLOduration=1.20029418 podStartE2EDuration="15.43126581s" podCreationTimestamp="2026-03-10 10:00:00 +0000 UTC" firstStartedPulling="2026-03-10 10:00:00.945128374 +0000 UTC m=+3387.200026263" lastFinishedPulling="2026-03-10 10:00:15.176100004 +0000 UTC m=+3401.430997893" observedRunningTime="2026-03-10 10:00:15.421283455 +0000 UTC m=+3401.676181345" watchObservedRunningTime="2026-03-10 10:00:15.43126581 +0000 UTC m=+3401.686163699"
Mar 10 10:00:16 crc
kubenswrapper[4883]: I0310 10:00:16.419994 4883 generic.go:334] "Generic (PLEG): container finished" podID="f7b2dc78-bc43-4cf8-a946-509772bb2522" containerID="b6eef7765d3e9379f71ea8bde62b6fb0aea46e71cd54c2a9f6788a8b9c14680a" exitCode=0 Mar 10 10:00:16 crc kubenswrapper[4883]: I0310 10:00:16.420067 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" event={"ID":"f7b2dc78-bc43-4cf8-a946-509772bb2522","Type":"ContainerDied","Data":"b6eef7765d3e9379f71ea8bde62b6fb0aea46e71cd54c2a9f6788a8b9c14680a"} Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.449320 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.449795 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.758834 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.892742 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") pod \"f7b2dc78-bc43-4cf8-a946-509772bb2522\" (UID: \"f7b2dc78-bc43-4cf8-a946-509772bb2522\") " Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.900510 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8" (OuterVolumeSpecName: "kube-api-access-jnwf8") pod "f7b2dc78-bc43-4cf8-a946-509772bb2522" (UID: "f7b2dc78-bc43-4cf8-a946-509772bb2522"). InnerVolumeSpecName "kube-api-access-jnwf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.996828 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") on node \"crc\" DevicePath \"\"" Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.446826 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" event={"ID":"f7b2dc78-bc43-4cf8-a946-509772bb2522","Type":"ContainerDied","Data":"fb8b19a18b53bfac4eac10e0cd5189c015cb4d934110a7e3faf39d5f432028c6"} Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.446908 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb8b19a18b53bfac4eac10e0cd5189c015cb4d934110a7e3faf39d5f432028c6" Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.446907 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.487102 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"] Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.497518 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"] Mar 10 10:00:20 crc kubenswrapper[4883]: I0310 10:00:20.089794 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05d7064d-bd5b-4775-ab8f-2d5780f76440" path="/var/lib/kubelet/pods/05d7064d-bd5b-4775-ab8f-2d5780f76440/volumes" Mar 10 10:00:47 crc kubenswrapper[4883]: I0310 10:00:47.449096 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:00:47 crc kubenswrapper[4883]: I0310 10:00:47.449709 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.148591 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29552281-kbhqs"] Mar 10 10:01:00 crc kubenswrapper[4883]: E0310 10:01:00.149438 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b2dc78-bc43-4cf8-a946-509772bb2522" containerName="oc" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149451 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b2dc78-bc43-4cf8-a946-509772bb2522" containerName="oc" Mar 10 10:01:00 crc kubenswrapper[4883]: 
E0310 10:01:00.149463 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="gather" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149489 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="gather" Mar 10 10:01:00 crc kubenswrapper[4883]: E0310 10:01:00.149503 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="copy" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149509 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="copy" Mar 10 10:01:00 crc kubenswrapper[4883]: E0310 10:01:00.149535 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa19fde8-da7f-4160-8ac1-79860fb75e66" containerName="collect-profiles" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149540 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa19fde8-da7f-4160-8ac1-79860fb75e66" containerName="collect-profiles" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149704 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa19fde8-da7f-4160-8ac1-79860fb75e66" containerName="collect-profiles" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149712 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="copy" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149728 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="gather" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149742 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b2dc78-bc43-4cf8-a946-509772bb2522" containerName="oc" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.150269 4883 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.160291 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552281-kbhqs"] Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.258961 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.259544 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.259726 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.259831 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.362544 4883 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.362667 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.362850 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.362944 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.368343 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.368940 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.369844 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.377452 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.466098 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.955517 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552281-kbhqs"] Mar 10 10:01:01 crc kubenswrapper[4883]: I0310 10:01:01.828303 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552281-kbhqs" event={"ID":"845ef92d-2dae-49c8-823f-9e3fe2735d79","Type":"ContainerStarted","Data":"fef295dc078c82d1da33ddf95186c154f12a13384b7e5473baf8c10544a1d019"} Mar 10 10:01:01 crc kubenswrapper[4883]: I0310 10:01:01.828963 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552281-kbhqs" event={"ID":"845ef92d-2dae-49c8-823f-9e3fe2735d79","Type":"ContainerStarted","Data":"10bfdfe3a827e673cb657120f380660eb88dc656044d982f500660c075167ae0"} Mar 10 10:01:01 crc kubenswrapper[4883]: I0310 10:01:01.852681 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29552281-kbhqs" podStartSLOduration=1.8526607130000001 podStartE2EDuration="1.852660713s" podCreationTimestamp="2026-03-10 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:01:01.850869327 +0000 UTC m=+3448.105767216" watchObservedRunningTime="2026-03-10 10:01:01.852660713 +0000 UTC m=+3448.107558602" Mar 10 10:01:03 crc kubenswrapper[4883]: I0310 10:01:03.852417 4883 generic.go:334] "Generic (PLEG): container finished" podID="845ef92d-2dae-49c8-823f-9e3fe2735d79" containerID="fef295dc078c82d1da33ddf95186c154f12a13384b7e5473baf8c10544a1d019" exitCode=0 Mar 10 10:01:03 crc kubenswrapper[4883]: I0310 10:01:03.852771 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552281-kbhqs" 
event={"ID":"845ef92d-2dae-49c8-823f-9e3fe2735d79","Type":"ContainerDied","Data":"fef295dc078c82d1da33ddf95186c154f12a13384b7e5473baf8c10544a1d019"} Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.181703 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.267817 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") pod \"845ef92d-2dae-49c8-823f-9e3fe2735d79\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.267955 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") pod \"845ef92d-2dae-49c8-823f-9e3fe2735d79\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.268142 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") pod \"845ef92d-2dae-49c8-823f-9e3fe2735d79\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.268217 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") pod \"845ef92d-2dae-49c8-823f-9e3fe2735d79\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.274050 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf" 
(OuterVolumeSpecName: "kube-api-access-fsmtf") pod "845ef92d-2dae-49c8-823f-9e3fe2735d79" (UID: "845ef92d-2dae-49c8-823f-9e3fe2735d79"). InnerVolumeSpecName "kube-api-access-fsmtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.276272 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "845ef92d-2dae-49c8-823f-9e3fe2735d79" (UID: "845ef92d-2dae-49c8-823f-9e3fe2735d79"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.292670 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "845ef92d-2dae-49c8-823f-9e3fe2735d79" (UID: "845ef92d-2dae-49c8-823f-9e3fe2735d79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.307406 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data" (OuterVolumeSpecName: "config-data") pod "845ef92d-2dae-49c8-823f-9e3fe2735d79" (UID: "845ef92d-2dae-49c8-823f-9e3fe2735d79"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.370627 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.370663 4883 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.370674 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.370683 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.873387 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552281-kbhqs" event={"ID":"845ef92d-2dae-49c8-823f-9e3fe2735d79","Type":"ContainerDied","Data":"10bfdfe3a827e673cb657120f380660eb88dc656044d982f500660c075167ae0"} Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.873448 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10bfdfe3a827e673cb657120f380660eb88dc656044d982f500660c075167ae0" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.873628 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:07 crc kubenswrapper[4883]: I0310 10:01:07.104861 4883 scope.go:117] "RemoveContainer" containerID="37b94d707e9a0d88465d35e2d3c44d0202d4cba279ab4acfb2218748019ab99d" Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.449070 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.449829 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.449894 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.451081 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.451159 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" 
containerID="cri-o://02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3" gracePeriod=600 Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.975311 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3" exitCode=0 Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.975391 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3"} Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.975721 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"} Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.975752 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.143324 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"] Mar 10 10:02:00 crc kubenswrapper[4883]: E0310 10:02:00.144268 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845ef92d-2dae-49c8-823f-9e3fe2735d79" containerName="keystone-cron" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.144283 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="845ef92d-2dae-49c8-823f-9e3fe2735d79" containerName="keystone-cron" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.144510 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="845ef92d-2dae-49c8-823f-9e3fe2735d79" containerName="keystone-cron" Mar 10 10:02:00 crc 
kubenswrapper[4883]: I0310 10:02:00.145126 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.146522 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.147345 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.147542 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.150549 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"] Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.320823 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") pod \"auto-csr-approver-29552282-pdd7d\" (UID: \"38ca7076-03d6-4598-a451-cf485909b9fc\") " pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.423788 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") pod \"auto-csr-approver-29552282-pdd7d\" (UID: \"38ca7076-03d6-4598-a451-cf485909b9fc\") " pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.444236 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") pod 
\"auto-csr-approver-29552282-pdd7d\" (UID: \"38ca7076-03d6-4598-a451-cf485909b9fc\") " pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.460368 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.849561 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"] Mar 10 10:02:01 crc kubenswrapper[4883]: I0310 10:02:01.366326 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" event={"ID":"38ca7076-03d6-4598-a451-cf485909b9fc","Type":"ContainerStarted","Data":"3c7b236e6abbfc9788ca950dcd363c22954f75dcb93bfc44c7bf284dbb489231"} Mar 10 10:02:02 crc kubenswrapper[4883]: I0310 10:02:02.377562 4883 generic.go:334] "Generic (PLEG): container finished" podID="38ca7076-03d6-4598-a451-cf485909b9fc" containerID="a0e857f4b7d8648de7f831deee2dafbed19a142ef98ff1e54d0826fe04524086" exitCode=0 Mar 10 10:02:02 crc kubenswrapper[4883]: I0310 10:02:02.377666 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" event={"ID":"38ca7076-03d6-4598-a451-cf485909b9fc","Type":"ContainerDied","Data":"a0e857f4b7d8648de7f831deee2dafbed19a142ef98ff1e54d0826fe04524086"} Mar 10 10:02:03 crc kubenswrapper[4883]: I0310 10:02:03.688513 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:03 crc kubenswrapper[4883]: I0310 10:02:03.794373 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") pod \"38ca7076-03d6-4598-a451-cf485909b9fc\" (UID: \"38ca7076-03d6-4598-a451-cf485909b9fc\") " Mar 10 10:02:03 crc kubenswrapper[4883]: I0310 10:02:03.799789 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl" (OuterVolumeSpecName: "kube-api-access-wjvdl") pod "38ca7076-03d6-4598-a451-cf485909b9fc" (UID: "38ca7076-03d6-4598-a451-cf485909b9fc"). InnerVolumeSpecName "kube-api-access-wjvdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:02:03 crc kubenswrapper[4883]: I0310 10:02:03.898284 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.397889 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" event={"ID":"38ca7076-03d6-4598-a451-cf485909b9fc","Type":"ContainerDied","Data":"3c7b236e6abbfc9788ca950dcd363c22954f75dcb93bfc44c7bf284dbb489231"} Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.398264 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c7b236e6abbfc9788ca950dcd363c22954f75dcb93bfc44c7bf284dbb489231" Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.397937 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.755576 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.761449 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 10:02:06 crc kubenswrapper[4883]: I0310 10:02:06.088540 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c430570-1b7a-4e38-9a8b-f13d69c18882" path="/var/lib/kubelet/pods/4c430570-1b7a-4e38-9a8b-f13d69c18882/volumes" Mar 10 10:02:07 crc kubenswrapper[4883]: I0310 10:02:07.193050 4883 scope.go:117] "RemoveContainer" containerID="637bb0552a72f7592feca76119ba2d1ac02ce406f7badd427582618ad5b1a1db" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.860215 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:26 crc kubenswrapper[4883]: E0310 10:02:26.861242 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ca7076-03d6-4598-a451-cf485909b9fc" containerName="oc" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.861257 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ca7076-03d6-4598-a451-cf485909b9fc" containerName="oc" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.861459 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ca7076-03d6-4598-a451-cf485909b9fc" containerName="oc" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.862811 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.882491 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.974612 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.974886 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.974989 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.075555 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.075651 4883 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.075719 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.076095 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.076307 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.094814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.183612 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.608406 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:27 crc kubenswrapper[4883]: W0310 10:02:27.612356 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aacda7b_a599_43b2_9c44_920593c90e36.slice/crio-2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e WatchSource:0}: Error finding container 2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e: Status 404 returned error can't find the container with id 2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e Mar 10 10:02:27 crc kubenswrapper[4883]: E0310 10:02:27.972255 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aacda7b_a599_43b2_9c44_920593c90e36.slice/crio-conmon-3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aacda7b_a599_43b2_9c44_920593c90e36.slice/crio-3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a.scope\": RecentStats: unable to find data in memory cache]" Mar 10 10:02:28 crc kubenswrapper[4883]: I0310 10:02:28.608464 4883 generic.go:334] "Generic (PLEG): container finished" podID="3aacda7b-a599-43b2-9c44-920593c90e36" containerID="3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a" exitCode=0 Mar 10 10:02:28 crc kubenswrapper[4883]: I0310 10:02:28.608598 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" 
event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerDied","Data":"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a"} Mar 10 10:02:28 crc kubenswrapper[4883]: I0310 10:02:28.608919 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerStarted","Data":"2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e"} Mar 10 10:02:29 crc kubenswrapper[4883]: I0310 10:02:29.617524 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerStarted","Data":"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422"} Mar 10 10:02:30 crc kubenswrapper[4883]: I0310 10:02:30.629646 4883 generic.go:334] "Generic (PLEG): container finished" podID="3aacda7b-a599-43b2-9c44-920593c90e36" containerID="4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422" exitCode=0 Mar 10 10:02:30 crc kubenswrapper[4883]: I0310 10:02:30.629773 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerDied","Data":"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422"} Mar 10 10:02:31 crc kubenswrapper[4883]: I0310 10:02:31.639076 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerStarted","Data":"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74"} Mar 10 10:02:31 crc kubenswrapper[4883]: I0310 10:02:31.655034 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bvrxx" podStartSLOduration=3.141951653 podStartE2EDuration="5.655014617s" podCreationTimestamp="2026-03-10 10:02:26 +0000 UTC" 
firstStartedPulling="2026-03-10 10:02:28.611220243 +0000 UTC m=+3534.866118133" lastFinishedPulling="2026-03-10 10:02:31.124283208 +0000 UTC m=+3537.379181097" observedRunningTime="2026-03-10 10:02:31.651842359 +0000 UTC m=+3537.906740248" watchObservedRunningTime="2026-03-10 10:02:31.655014617 +0000 UTC m=+3537.909912506" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.417731 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.419815 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.427261 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8cw7n"/"openshift-service-ca.crt" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.427510 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8cw7n"/"kube-root-ca.crt" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.451962 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.452159 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.462123 4883 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.555124 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.555502 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.555958 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.574967 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.738962 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:34 crc kubenswrapper[4883]: I0310 10:02:34.239519 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:02:34 crc kubenswrapper[4883]: W0310 10:02:34.244108 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20b00153_6497_4507_8247_81caa30a91bc.slice/crio-a0c91bdfe040142ab7c035e1e9087020190db2dfd57aff285cbc0fe251003974 WatchSource:0}: Error finding container a0c91bdfe040142ab7c035e1e9087020190db2dfd57aff285cbc0fe251003974: Status 404 returned error can't find the container with id a0c91bdfe040142ab7c035e1e9087020190db2dfd57aff285cbc0fe251003974 Mar 10 10:02:34 crc kubenswrapper[4883]: I0310 10:02:34.669751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/must-gather-29w7p" event={"ID":"20b00153-6497-4507-8247-81caa30a91bc","Type":"ContainerStarted","Data":"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4"} Mar 10 10:02:34 crc kubenswrapper[4883]: I0310 10:02:34.670195 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/must-gather-29w7p" event={"ID":"20b00153-6497-4507-8247-81caa30a91bc","Type":"ContainerStarted","Data":"a0c91bdfe040142ab7c035e1e9087020190db2dfd57aff285cbc0fe251003974"} Mar 10 10:02:35 crc kubenswrapper[4883]: I0310 10:02:35.679291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/must-gather-29w7p" event={"ID":"20b00153-6497-4507-8247-81caa30a91bc","Type":"ContainerStarted","Data":"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183"} Mar 10 10:02:35 crc kubenswrapper[4883]: I0310 10:02:35.693067 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8cw7n/must-gather-29w7p" podStartSLOduration=2.693048208 
podStartE2EDuration="2.693048208s" podCreationTimestamp="2026-03-10 10:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:02:35.692257349 +0000 UTC m=+3541.947155228" watchObservedRunningTime="2026-03-10 10:02:35.693048208 +0000 UTC m=+3541.947946097" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.184056 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.184407 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.227753 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.582682 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-9f7pq"] Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.584053 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.586452 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8cw7n"/"default-dockercfg-68b9q" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.655571 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.655987 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.731766 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.757976 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.758161 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" 
Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.758286 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.776914 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.778637 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.898827 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:38 crc kubenswrapper[4883]: I0310 10:02:38.704469 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" event={"ID":"71aa2959-c8ee-46b8-bd2b-654620fbd99a","Type":"ContainerStarted","Data":"3bd8ffbe632d745f21bae1bcada8aafee95f3789b72071a8dc962e253bd69366"} Mar 10 10:02:38 crc kubenswrapper[4883]: I0310 10:02:38.705038 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" event={"ID":"71aa2959-c8ee-46b8-bd2b-654620fbd99a","Type":"ContainerStarted","Data":"cb0e60bc99c835c81493440e0b3625b06eace07b336b1d7ae7500b2ea5926dfa"} Mar 10 10:02:38 crc kubenswrapper[4883]: I0310 10:02:38.723441 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" podStartSLOduration=1.7233978410000002 podStartE2EDuration="1.723397841s" podCreationTimestamp="2026-03-10 10:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:02:38.714252149 +0000 UTC m=+3544.969150038" watchObservedRunningTime="2026-03-10 10:02:38.723397841 +0000 UTC m=+3544.978295729" Mar 10 10:02:39 crc kubenswrapper[4883]: I0310 10:02:39.714631 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bvrxx" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="registry-server" containerID="cri-o://90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" gracePeriod=2 Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.154862 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.210247 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") pod \"3aacda7b-a599-43b2-9c44-920593c90e36\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.210468 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") pod \"3aacda7b-a599-43b2-9c44-920593c90e36\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.210585 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") pod \"3aacda7b-a599-43b2-9c44-920593c90e36\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.211771 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities" (OuterVolumeSpecName: "utilities") pod "3aacda7b-a599-43b2-9c44-920593c90e36" (UID: "3aacda7b-a599-43b2-9c44-920593c90e36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.220594 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p" (OuterVolumeSpecName: "kube-api-access-jf72p") pod "3aacda7b-a599-43b2-9c44-920593c90e36" (UID: "3aacda7b-a599-43b2-9c44-920593c90e36"). InnerVolumeSpecName "kube-api-access-jf72p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.313524 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.313561 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.723562 4883 generic.go:334] "Generic (PLEG): container finished" podID="3aacda7b-a599-43b2-9c44-920593c90e36" containerID="90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" exitCode=0 Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.723625 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.723645 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerDied","Data":"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74"} Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.723992 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerDied","Data":"2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e"} Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.724011 4883 scope.go:117] "RemoveContainer" containerID="90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.741513 4883 scope.go:117] "RemoveContainer" 
containerID="4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.761176 4883 scope.go:117] "RemoveContainer" containerID="3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.808752 4883 scope.go:117] "RemoveContainer" containerID="90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" Mar 10 10:02:40 crc kubenswrapper[4883]: E0310 10:02:40.809170 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74\": container with ID starting with 90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74 not found: ID does not exist" containerID="90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.809216 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74"} err="failed to get container status \"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74\": rpc error: code = NotFound desc = could not find container \"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74\": container with ID starting with 90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74 not found: ID does not exist" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.809246 4883 scope.go:117] "RemoveContainer" containerID="4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422" Mar 10 10:02:40 crc kubenswrapper[4883]: E0310 10:02:40.809865 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422\": container with ID starting with 
4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422 not found: ID does not exist" containerID="4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.809899 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422"} err="failed to get container status \"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422\": rpc error: code = NotFound desc = could not find container \"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422\": container with ID starting with 4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422 not found: ID does not exist" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.809924 4883 scope.go:117] "RemoveContainer" containerID="3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a" Mar 10 10:02:40 crc kubenswrapper[4883]: E0310 10:02:40.810234 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a\": container with ID starting with 3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a not found: ID does not exist" containerID="3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.810291 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a"} err="failed to get container status \"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a\": rpc error: code = NotFound desc = could not find container \"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a\": container with ID starting with 3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a not found: ID does not 
exist" Mar 10 10:02:41 crc kubenswrapper[4883]: I0310 10:02:41.565786 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aacda7b-a599-43b2-9c44-920593c90e36" (UID: "3aacda7b-a599-43b2-9c44-920593c90e36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:02:41 crc kubenswrapper[4883]: I0310 10:02:41.643327 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:41 crc kubenswrapper[4883]: I0310 10:02:41.654187 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:41 crc kubenswrapper[4883]: I0310 10:02:41.661346 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:42 crc kubenswrapper[4883]: I0310 10:02:42.090176 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" path="/var/lib/kubelet/pods/3aacda7b-a599-43b2-9c44-920593c90e36/volumes" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.394924 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mhscc"] Mar 10 10:02:59 crc kubenswrapper[4883]: E0310 10:02:59.396700 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="extract-content" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.396729 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="extract-content" Mar 10 10:02:59 crc kubenswrapper[4883]: E0310 10:02:59.396762 4883 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="extract-utilities" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.396768 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="extract-utilities" Mar 10 10:02:59 crc kubenswrapper[4883]: E0310 10:02:59.396783 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="registry-server" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.396788 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="registry-server" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.396976 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="registry-server" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.398280 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.410055 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"] Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.523502 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.523581 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") pod \"certified-operators-mhscc\" (UID: 
\"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.523678 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.624866 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.625077 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.625219 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.625293 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.625680 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.641795 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.730694 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.168118 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"]
Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.922789 4883 generic.go:334] "Generic (PLEG): container finished" podID="7054bc28-c5d1-41b1-a322-ba547740a357" containerID="4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2" exitCode=0
Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.923127 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerDied","Data":"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2"}
Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.923193 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerStarted","Data":"0efd33f1cd5c5db073f800f5a2c225b5dab7634b7633268d907da532c5609709"}
Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.926837 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 10:03:01 crc kubenswrapper[4883]: I0310 10:03:01.932423 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerStarted","Data":"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4"}
Mar 10 10:03:02 crc kubenswrapper[4883]: I0310 10:03:02.943450 4883 generic.go:334] "Generic (PLEG): container finished" podID="7054bc28-c5d1-41b1-a322-ba547740a357" containerID="05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4" exitCode=0
Mar 10 10:03:02 crc kubenswrapper[4883]: I0310 10:03:02.943668 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerDied","Data":"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4"}
Mar 10 10:03:03 crc kubenswrapper[4883]: I0310 10:03:03.969646 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerStarted","Data":"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f"}
Mar 10 10:03:03 crc kubenswrapper[4883]: I0310 10:03:03.991635 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mhscc" podStartSLOduration=2.448060207 podStartE2EDuration="4.991618148s" podCreationTimestamp="2026-03-10 10:02:59 +0000 UTC" firstStartedPulling="2026-03-10 10:03:00.926618269 +0000 UTC m=+3567.181516159" lastFinishedPulling="2026-03-10 10:03:03.470176211 +0000 UTC m=+3569.725074100" observedRunningTime="2026-03-10 10:03:03.983886309 +0000 UTC m=+3570.238784199" watchObservedRunningTime="2026-03-10 10:03:03.991618148 +0000 UTC m=+3570.246516037"
Mar 10 10:03:04 crc kubenswrapper[4883]: I0310 10:03:04.977973 4883 generic.go:334] "Generic (PLEG): container finished" podID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" containerID="3bd8ffbe632d745f21bae1bcada8aafee95f3789b72071a8dc962e253bd69366" exitCode=0
Mar 10 10:03:04 crc kubenswrapper[4883]: I0310 10:03:04.978052 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" event={"ID":"71aa2959-c8ee-46b8-bd2b-654620fbd99a","Type":"ContainerDied","Data":"3bd8ffbe632d745f21bae1bcada8aafee95f3789b72071a8dc962e253bd69366"}
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.073664 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq"
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.108057 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-9f7pq"]
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.113520 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-9f7pq"]
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.162466 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") pod \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") "
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.162533 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host" (OuterVolumeSpecName: "host") pod "71aa2959-c8ee-46b8-bd2b-654620fbd99a" (UID: "71aa2959-c8ee-46b8-bd2b-654620fbd99a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.162779 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") pod \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") "
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.163199 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") on node \"crc\" DevicePath \"\""
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.167516 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz" (OuterVolumeSpecName: "kube-api-access-62xkz") pod "71aa2959-c8ee-46b8-bd2b-654620fbd99a" (UID: "71aa2959-c8ee-46b8-bd2b-654620fbd99a"). InnerVolumeSpecName "kube-api-access-62xkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.265860 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") on node \"crc\" DevicePath \"\""
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.995648 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb0e60bc99c835c81493440e0b3625b06eace07b336b1d7ae7500b2ea5926dfa"
Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.995732 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.304001 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-wsv4n"]
Mar 10 10:03:07 crc kubenswrapper[4883]: E0310 10:03:07.304732 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" containerName="container-00"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.304750 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" containerName="container-00"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.304933 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" containerName="container-00"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.305597 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.307202 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8cw7n"/"default-dockercfg-68b9q"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.389090 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.389259 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.490356 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.490443 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.490631 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.509379 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.661224 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:07 crc kubenswrapper[4883]: W0310 10:03:07.693618 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf05e8636_7b95_4487_bd28_96cb3159b18e.slice/crio-da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce WatchSource:0}: Error finding container da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce: Status 404 returned error can't find the container with id da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce
Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.016241 4883 generic.go:334] "Generic (PLEG): container finished" podID="f05e8636-7b95-4487-bd28-96cb3159b18e" containerID="5f767fd40e7a1f328bfb19f2b7318e01d59fdbb800fcd79f270e6c2ebac7a271" exitCode=0
Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.016467 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" event={"ID":"f05e8636-7b95-4487-bd28-96cb3159b18e","Type":"ContainerDied","Data":"5f767fd40e7a1f328bfb19f2b7318e01d59fdbb800fcd79f270e6c2ebac7a271"}
Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.016506 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" event={"ID":"f05e8636-7b95-4487-bd28-96cb3159b18e","Type":"ContainerStarted","Data":"da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce"}
Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.088840 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" path="/var/lib/kubelet/pods/71aa2959-c8ee-46b8-bd2b-654620fbd99a/volumes"
Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.445046 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-wsv4n"]
Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.452607 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-wsv4n"]
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.107937 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.228148 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") pod \"f05e8636-7b95-4487-bd28-96cb3159b18e\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") "
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.228456 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") pod \"f05e8636-7b95-4487-bd28-96cb3159b18e\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") "
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.228276 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host" (OuterVolumeSpecName: "host") pod "f05e8636-7b95-4487-bd28-96cb3159b18e" (UID: "f05e8636-7b95-4487-bd28-96cb3159b18e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.229567 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") on node \"crc\" DevicePath \"\""
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.234026 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n" (OuterVolumeSpecName: "kube-api-access-cnm4n") pod "f05e8636-7b95-4487-bd28-96cb3159b18e" (UID: "f05e8636-7b95-4487-bd28-96cb3159b18e"). InnerVolumeSpecName "kube-api-access-cnm4n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.332417 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") on node \"crc\" DevicePath \"\""
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.616992 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-q6vmd"]
Mar 10 10:03:09 crc kubenswrapper[4883]: E0310 10:03:09.617654 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05e8636-7b95-4487-bd28-96cb3159b18e" containerName="container-00"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.617670 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05e8636-7b95-4487-bd28-96cb3159b18e" containerName="container-00"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.617859 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05e8636-7b95-4487-bd28-96cb3159b18e" containerName="container-00"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.618454 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.731639 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.731698 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.741385 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.741918 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.768060 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.843516 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.843657 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.843749 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.859646 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.931770 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.034170 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce"
Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.034183 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n"
Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.035651 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" event={"ID":"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5","Type":"ContainerStarted","Data":"d795a2b4d8eabba9d1a1bb4399248590bb62fc636940c89f84c73983260a5143"}
Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.088814 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05e8636-7b95-4487-bd28-96cb3159b18e" path="/var/lib/kubelet/pods/f05e8636-7b95-4487-bd28-96cb3159b18e/volumes"
Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.089550 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.152754 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"]
Mar 10 10:03:11 crc kubenswrapper[4883]: I0310 10:03:11.044056 4883 generic.go:334] "Generic (PLEG): container finished" podID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" containerID="0b5b985860f6b61a235555a89c89e510ababeb5a50b400ad61a81c43c2bb3059" exitCode=0
Mar 10 10:03:11 crc kubenswrapper[4883]: I0310 10:03:11.044112 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" event={"ID":"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5","Type":"ContainerDied","Data":"0b5b985860f6b61a235555a89c89e510ababeb5a50b400ad61a81c43c2bb3059"}
Mar 10 10:03:11 crc kubenswrapper[4883]: I0310 10:03:11.076205 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-q6vmd"]
Mar 10 10:03:11 crc kubenswrapper[4883]: I0310 10:03:11.091894 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-q6vmd"]
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.050586 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mhscc" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="registry-server" containerID="cri-o://4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f" gracePeriod=2
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.231589 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.293040 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") pod \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") "
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.293182 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") pod \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") "
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.293306 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host" (OuterVolumeSpecName: "host") pod "5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" (UID: "5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.293772 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") on node \"crc\" DevicePath \"\""
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.299577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf" (OuterVolumeSpecName: "kube-api-access-49rqf") pod "5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" (UID: "5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5"). InnerVolumeSpecName "kube-api-access-49rqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.392154 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.396062 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") on node \"crc\" DevicePath \"\""
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.497959 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") pod \"7054bc28-c5d1-41b1-a322-ba547740a357\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") "
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.498360 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") pod \"7054bc28-c5d1-41b1-a322-ba547740a357\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") "
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.498598 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") pod \"7054bc28-c5d1-41b1-a322-ba547740a357\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") "
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.499630 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities" (OuterVolumeSpecName: "utilities") pod "7054bc28-c5d1-41b1-a322-ba547740a357" (UID: "7054bc28-c5d1-41b1-a322-ba547740a357"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.502081 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2" (OuterVolumeSpecName: "kube-api-access-hngb2") pod "7054bc28-c5d1-41b1-a322-ba547740a357" (UID: "7054bc28-c5d1-41b1-a322-ba547740a357"). InnerVolumeSpecName "kube-api-access-hngb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.602403 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") on node \"crc\" DevicePath \"\""
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.602813 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.789335 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7054bc28-c5d1-41b1-a322-ba547740a357" (UID: "7054bc28-c5d1-41b1-a322-ba547740a357"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.806731 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.072270 4883 generic.go:334] "Generic (PLEG): container finished" podID="7054bc28-c5d1-41b1-a322-ba547740a357" containerID="4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f" exitCode=0
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.072348 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerDied","Data":"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f"}
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.072407 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerDied","Data":"0efd33f1cd5c5db073f800f5a2c225b5dab7634b7633268d907da532c5609709"}
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.072440 4883 scope.go:117] "RemoveContainer" containerID="4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.074225 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhscc"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.078372 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.101086 4883 scope.go:117] "RemoveContainer" containerID="05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.110780 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"]
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.119585 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"]
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.119939 4883 scope.go:117] "RemoveContainer" containerID="4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.165143 4883 scope.go:117] "RemoveContainer" containerID="4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f"
Mar 10 10:03:13 crc kubenswrapper[4883]: E0310 10:03:13.165709 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f\": container with ID starting with 4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f not found: ID does not exist" containerID="4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.165773 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f"} err="failed to get container status \"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f\": rpc error: code = NotFound desc = could not find container \"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f\": container with ID starting with 4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f not found: ID does not exist"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.165804 4883 scope.go:117] "RemoveContainer" containerID="05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4"
Mar 10 10:03:13 crc kubenswrapper[4883]: E0310 10:03:13.166165 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4\": container with ID starting with 05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4 not found: ID does not exist" containerID="05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.166207 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4"} err="failed to get container status \"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4\": rpc error: code = NotFound desc = could not find container \"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4\": container with ID starting with 05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4 not found: ID does not exist"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.166232 4883 scope.go:117] "RemoveContainer" containerID="4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2"
Mar 10 10:03:13 crc kubenswrapper[4883]: E0310 10:03:13.168797 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2\": container with ID starting with 4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2 not found: ID does not exist" containerID="4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.168844 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2"} err="failed to get container status \"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2\": rpc error: code = NotFound desc = could not find container \"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2\": container with ID starting with 4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2 not found: ID does not exist"
Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.168876 4883 scope.go:117] "RemoveContainer" containerID="0b5b985860f6b61a235555a89c89e510ababeb5a50b400ad61a81c43c2bb3059"
Mar 10 10:03:14 crc kubenswrapper[4883]: I0310 10:03:14.094567 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" path="/var/lib/kubelet/pods/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5/volumes"
Mar 10 10:03:14 crc kubenswrapper[4883]: I0310 10:03:14.095535 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" path="/var/lib/kubelet/pods/7054bc28-c5d1-41b1-a322-ba547740a357/volumes"
Mar 10 10:03:17 crc kubenswrapper[4883]: I0310 10:03:17.448973 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 10:03:17 crc kubenswrapper[4883]: I0310 10:03:17.449361 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.518229 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b55764b68-l794s_17a46674-c6ec-4128-8285-d71c228d11c8/barbican-api/0.log"
Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.677579 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b55764b68-l794s_17a46674-c6ec-4128-8285-d71c228d11c8/barbican-api-log/0.log"
Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.692325 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f87cd7fb6-jz6ch_dce7df3b-5f31-4732-8d27-8e06dc07824d/barbican-keystone-listener/0.log"
Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.704195 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f87cd7fb6-jz6ch_dce7df3b-5f31-4732-8d27-8e06dc07824d/barbican-keystone-listener-log/0.log"
Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.868930 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d48955d69-pvbn8_221490bc-406a-436f-8705-66106ed6bbe0/barbican-worker/0.log"
Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.880573 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d48955d69-pvbn8_221490bc-406a-436f-8705-66106ed6bbe0/barbican-worker-log/0.log"
Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.163023 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r_de8c98db-31db-4ecd-83f2-c53d4bdd2ddd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.205766 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/ceilometer-central-agent/0.log"
Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.276600 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/ceilometer-notification-agent/0.log"
Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.329429 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/proxy-httpd/0.log"
Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.383623 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/sg-core/0.log"
Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.499077 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2caba5f6-d05e-437e-868c-952e8adf3278/cinder-api/0.log"
Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.540004 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2caba5f6-d05e-437e-868c-952e8adf3278/cinder-api-log/0.log"
Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.694198 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a7bae0a1-9bb8-47ba-a161-764cd7406992/probe/0.log"
Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.721231 4883 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_cinder-scheduler-0_a7bae0a1-9bb8-47ba-a161-764cd7406992/cinder-scheduler/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.818633 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm_07ddb6af-f2c7-46eb-aac4-fe69996caf27/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.927636 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh_269dd9c8-3d75-4892-9f75-c4fe1b9093b8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.011962 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/init/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.175105 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/init/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.213972 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/dnsmasq-dns/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.228753 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr_2428d4e5-b48e-45ad-9bfb-711c3b1e8471/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.393697 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4c354fe8-851f-4cf4-bc13-e06dba0a1cc0/glance-httpd/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.415527 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_4c354fe8-851f-4cf4-bc13-e06dba0a1cc0/glance-log/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.572247 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_676458e7-e4a0-4f1a-b200-0ab75faaddb4/glance-httpd/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.579336 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_676458e7-e4a0-4f1a-b200-0ab75faaddb4/glance-log/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.692022 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c96ddb-t7tcm_ef0598ad-c7ea-4645-b553-7d9028397156/horizon/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.826168 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9_7e9f7531-37e1-4284-94ac-cada3d2fc301/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.031233 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c96ddb-t7tcm_ef0598ad-c7ea-4645-b553-7d9028397156/horizon-log/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.037484 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kglh5_361b2613-f26e-45c3-aabe-9a0f115e8e10/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.234682 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-744f4576f6-kglt9_c6effa97-6f88-4706-98bc-b51af01bd993/keystone-api/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.481796 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29552281-kbhqs_845ef92d-2dae-49c8-823f-9e3fe2735d79/keystone-cron/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.532394 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_39c373dd-952a-4305-82ed-1d047c7a859f/kube-state-metrics/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.640219 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-thhsw_eb3b72a2-945a-4719-87c0-ffaf7eb84b52/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.990721 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b5fb6fc5c-pj985_82fb8a17-1c35-415a-8a5d-478730286eb1/neutron-httpd/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.012816 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b5fb6fc5c-pj985_82fb8a17-1c35-415a-8a5d-478730286eb1/neutron-api/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.091029 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4_d37d0afe-ad64-4616-b877-bd05deefd038/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.612380 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_14ae000e-33d5-4caa-8b61-dd1ab03b9978/nova-api-log/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.680002 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_19096ebe-3796-4e22-a477-45d3e635a80a/nova-cell0-conductor-conductor/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.842613 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_90b06d82-9f07-4c29-9bad-987d2c6d027c/nova-cell1-conductor-conductor/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.941564 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_14ae000e-33d5-4caa-8b61-dd1ab03b9978/nova-api-api/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.986938 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2c5d710c-62fb-4a8c-8a5c-ec6709017c75/nova-cell1-novncproxy-novncproxy/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.118452 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-47dxf_af134b73-8c24-4b9e-b15e-48ff4b83ecd4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.310789 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0743bd84-b1d5-4634-9a7f-2c9daf2a5994/nova-metadata-log/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.623896 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/mysql-bootstrap/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.629146 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_626b3115-ced1-45ea-8401-e2bd7e79a20c/nova-scheduler-scheduler/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.805751 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/mysql-bootstrap/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.826129 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/galera/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.004336 
4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/mysql-bootstrap/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.260282 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/mysql-bootstrap/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.290906 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/galera/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.485251 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0743bd84-b1d5-4634-9a7f-2c9daf2a5994/nova-metadata-metadata/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.584916 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_166b0c95-d44f-41e4-b27a-01e549dfb9d2/openstackclient/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.630626 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lb2z9_6691939e-adb0-420c-bf9e-f4a9b670c83b/ovn-controller/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.793729 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server-init/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.840600 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-b2z2p_570aed6d-03dc-4ad5-b0e1-c6efc4facabb/openstack-network-exporter/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.982559 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovs-vswitchd/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.052024 4883 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.077079 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server-init/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.178003 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7cqkz_bbcde384-73a5-48c3-a5fb-226d671707cb/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.265028 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b47099e9-f945-4873-a704-ee55b0f0ac46/ovn-northd/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.302913 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b47099e9-f945-4873-a704-ee55b0f0ac46/openstack-network-exporter/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.474877 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_be383ddb-b33d-4129-acf8-1ffbbc21b1d4/openstack-network-exporter/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.498432 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_be383ddb-b33d-4129-acf8-1ffbbc21b1d4/ovsdbserver-nb/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.622661 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249a9bf5-ef0f-4209-855e-3fa422106519/openstack-network-exporter/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.647711 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249a9bf5-ef0f-4209-855e-3fa422106519/ovsdbserver-sb/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.861849 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5946656968-5mzlm_309c3af5-db30-48b8-8118-471950b7312c/placement-log/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.889904 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5946656968-5mzlm_309c3af5-db30-48b8-8118-471950b7312c/placement-api/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.906933 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/setup-container/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.115644 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/setup-container/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.190080 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/rabbitmq/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.196745 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/setup-container/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.381072 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/rabbitmq/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.398834 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/setup-container/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.488509 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz_0efdf39d-2133-4aaf-9fec-2b50533d3cae/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 
10:03:46.686903 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pf4n9_d3461a81-abbe-4c3e-88ca-42eff1eeb14e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.715094 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7_4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.877709 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rlqjc_61bb4cc5-1d4f-4439-a00e-4b2e27d4802b/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.925684 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-v5v84_caa69332-97ab-4629-900f-1596af363ba4/ssh-known-hosts-edpm-deployment/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.169802 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6656f7cc-nv5pp_ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b/proxy-server/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.182108 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-n4vhh_cbe93226-96c7-4854-abdc-4afe54ad7ad5/swift-ring-rebalance/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.225064 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6656f7cc-nv5pp_ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b/proxy-httpd/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.448585 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.448653 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.550124 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-auditor/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.577520 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-reaper/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.650186 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-replicator/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.767231 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-server/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.769634 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-auditor/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.795720 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-replicator/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.871807 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-server/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 
10:03:47.942523 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-auditor/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.959525 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-updater/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.010245 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-expirer/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.097048 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-replicator/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.142224 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-updater/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.149458 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-server/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.267612 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/rsync/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.300309 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/swift-recon-cron/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.386619 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-blk56_b083d3b3-edb7-4d2f-a7b7-f1275bd83fde/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.525007 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d483d791-15b3-49e7-8095-5660a9d0fdaa/tempest-tests-tempest-tests-runner/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.589712 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4d76dec9-afd2-4850-aacb-c8d60819fc1e/test-operator-logs-container/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.756604 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp_20e06399-dd26-4a60-a6b7-261cc4505a92/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:59 crc kubenswrapper[4883]: I0310 10:03:59.361523 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_52bdcacc-ce19-418b-871c-35482038da29/memcached/0.log" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.138612 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552284-6w2t6"] Mar 10 10:04:00 crc kubenswrapper[4883]: E0310 10:04:00.139209 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" containerName="container-00" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139228 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" containerName="container-00" Mar 10 10:04:00 crc kubenswrapper[4883]: E0310 10:04:00.139243 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="registry-server" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139249 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="registry-server" Mar 10 10:04:00 crc kubenswrapper[4883]: E0310 10:04:00.139259 4883 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="extract-utilities" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139266 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="extract-utilities" Mar 10 10:04:00 crc kubenswrapper[4883]: E0310 10:04:00.139288 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="extract-content" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139294 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="extract-content" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139529 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="registry-server" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139564 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" containerName="container-00" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.140171 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.141933 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.142265 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.142799 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.147286 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552284-6w2t6"] Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.287004 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") pod \"auto-csr-approver-29552284-6w2t6\" (UID: \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\") " pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.389695 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") pod \"auto-csr-approver-29552284-6w2t6\" (UID: \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\") " pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.406273 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") pod \"auto-csr-approver-29552284-6w2t6\" (UID: \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\") " 
pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.457131 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.864719 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552284-6w2t6"] Mar 10 10:04:01 crc kubenswrapper[4883]: I0310 10:04:01.541599 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" event={"ID":"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e","Type":"ContainerStarted","Data":"1a8589d54c69ed12440517871fa245b292af6750147579b2c23f96816d3a02e5"} Mar 10 10:04:02 crc kubenswrapper[4883]: I0310 10:04:02.552573 4883 generic.go:334] "Generic (PLEG): container finished" podID="00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" containerID="267d7220e198001e5b7f146d949413df18a5c26f7173f9289b7707d0d7351557" exitCode=0 Mar 10 10:04:02 crc kubenswrapper[4883]: I0310 10:04:02.552665 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" event={"ID":"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e","Type":"ContainerDied","Data":"267d7220e198001e5b7f146d949413df18a5c26f7173f9289b7707d0d7351557"} Mar 10 10:04:03 crc kubenswrapper[4883]: I0310 10:04:03.838082 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:03 crc kubenswrapper[4883]: I0310 10:04:03.953928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") pod \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\" (UID: \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\") " Mar 10 10:04:03 crc kubenswrapper[4883]: I0310 10:04:03.961263 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq" (OuterVolumeSpecName: "kube-api-access-465nq") pod "00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" (UID: "00b10c19-3e2e-4f4a-812f-bdfaa0415a7e"). InnerVolumeSpecName "kube-api-access-465nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.058390 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.571103 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" event={"ID":"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e","Type":"ContainerDied","Data":"1a8589d54c69ed12440517871fa245b292af6750147579b2c23f96816d3a02e5"} Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.571164 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a8589d54c69ed12440517871fa245b292af6750147579b2c23f96816d3a02e5" Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.571169 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.909109 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.918197 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 10:04:06 crc kubenswrapper[4883]: I0310 10:04:06.091612 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63837d10-3c84-4972-98da-7415e14f2594" path="/var/lib/kubelet/pods/63837d10-3c84-4972-98da-7415e14f2594/volumes" Mar 10 10:04:07 crc kubenswrapper[4883]: I0310 10:04:07.282103 4883 scope.go:117] "RemoveContainer" containerID="d248878325804477b2b46157dab9cab4990cb786e7cd390c24f00599d57f6825" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.193919 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.358798 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.366828 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.384971 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.517963 4883 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.529709 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.552984 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/extract/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.968799 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-h2cxw_9a394c48-31ca-4e99-b210-45ae6f67faaa/manager/0.log" Mar 10 10:04:12 crc kubenswrapper[4883]: I0310 10:04:12.267828 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-w9dbp_63474f68-d09d-4822-b650-96a37aead592/manager/0.log" Mar 10 10:04:12 crc kubenswrapper[4883]: I0310 10:04:12.376678 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-mbxnn_bf027c79-6bdb-4cfb-8c31-d785b80e2231/manager/0.log" Mar 10 10:04:12 crc kubenswrapper[4883]: I0310 10:04:12.559092 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fvwbt_8a4cb5eb-0894-440e-8cfd-448651696a6f/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.020389 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-txdwh_884f7bcb-08ef-49f3-912b-ca921e342615/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 
10:04:13.082059 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-nzdsk_09a04267-a914-4c55-add8-735a053038d3/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.166532 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-v6p2d_c994e4ad-140c-4655-ad69-e4013406d12e/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.326726 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-v5kxw_ad93994a-26d2-4353-80be-456c1311020e/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.413279 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-dgrlb_8b177c77-d85f-4374-b6db-a700719c1282/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.735189 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-kz9sv_ec624ec4-966f-410c-95c7-73be0f9cad27/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.784358 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-snvh5_91415f40-08a2-451b-abe8-38c7b447e66f/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.992468 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-rpwdx_760c8dff-c64a-492b-a778-45ef16d197bd/manager/0.log" Mar 10 10:04:14 crc kubenswrapper[4883]: I0310 10:04:14.056757 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-49gjk_d0e08342-2d1b-42d9-921e-1d948f701a58/manager/0.log" Mar 10 10:04:14 crc 
kubenswrapper[4883]: I0310 10:04:14.218318 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6647d7885f9f2px_2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f/manager/0.log" Mar 10 10:04:14 crc kubenswrapper[4883]: I0310 10:04:14.519696 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6cf8df7788-tzrb8_31e7ec33-4b44-48ce-9f01-e483a7668dd6/operator/0.log" Mar 10 10:04:14 crc kubenswrapper[4883]: I0310 10:04:14.765242 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-c4vjl_83852eec-509b-4074-b837-4f00d1d07d05/registry-server/0.log" Mar 10 10:04:14 crc kubenswrapper[4883]: I0310 10:04:14.866562 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-qnwgj_c13f33e2-dd6a-4ca0-91e7-5489c753e273/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.047869 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-pppd9_04b3aecb-7cfd-4042-b003-4bc8c339aff8/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.186879 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pjjsn_475c1190-6d94-431a-943d-4e749ea87d6b/operator/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.302033 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-m6wph_1b429bd6-00de-4cc2-8a18-9f58897b6834/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.521002 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-mkjnt_3f4c2998-b51a-4620-b674-60bb0817eb7d/manager/0.log" Mar 10 
10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.592573 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-8mpp4_d3d3c04d-7e05-4df2-85c6-394d0bde1a69/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.727943 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-rkjsw_a7216675-a296-4faa-9dd5-d857b15ffa3c/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.838630 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6679ddfdc7-9ntl4_969b2d39-fb99-42df-8e6e-3ded5cd292c8/manager/0.log" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.448715 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.449189 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.449267 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.450686 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"} 
pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.450769 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" gracePeriod=600 Mar 10 10:04:17 crc kubenswrapper[4883]: E0310 10:04:17.593064 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.685239 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" exitCode=0 Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.685288 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"} Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.685450 4883 scope.go:117] "RemoveContainer" containerID="02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.686085 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 
10 10:04:17 crc kubenswrapper[4883]: E0310 10:04:17.687888 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.704623 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-q52nj_ac18771f-5f45-40d8-b275-38e2e1c48ba6/manager/0.log" Mar 10 10:04:28 crc kubenswrapper[4883]: I0310 10:04:28.080304 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:04:28 crc kubenswrapper[4883]: E0310 10:04:28.081410 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:04:32 crc kubenswrapper[4883]: I0310 10:04:32.104413 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dlh8b_7ec510e9-f96b-44da-abec-7d49115d0c83/control-plane-machine-set-operator/0.log" Mar 10 10:04:32 crc kubenswrapper[4883]: I0310 10:04:32.248813 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7clc9_3de74a75-4aa1-46dd-ae5b-5c82b91811e5/kube-rbac-proxy/0.log" Mar 10 10:04:32 crc kubenswrapper[4883]: I0310 
10:04:32.260349 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7clc9_3de74a75-4aa1-46dd-ae5b-5c82b91811e5/machine-api-operator/0.log" Mar 10 10:04:42 crc kubenswrapper[4883]: I0310 10:04:42.676668 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-kl2rd_1c0c9250-e9df-4898-bd0e-91919353a3f6/cert-manager-controller/0.log" Mar 10 10:04:42 crc kubenswrapper[4883]: I0310 10:04:42.811740 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n2g9x_b92cb5d0-214a-49a6-b9b7-f210fef36956/cert-manager-cainjector/0.log" Mar 10 10:04:42 crc kubenswrapper[4883]: I0310 10:04:42.850598 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-dfhh4_f33cf1b9-ce0d-41f4-8f36-1b159badc41e/cert-manager-webhook/0.log" Mar 10 10:04:43 crc kubenswrapper[4883]: I0310 10:04:43.079981 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:04:43 crc kubenswrapper[4883]: E0310 10:04:43.080229 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.416243 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mr8tf_805fc4e3-bab7-415e-a190-0ceeda5bd8b7/nmstate-console-plugin/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.564106 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-5lcxd_d9c7e9ee-a0a0-4afe-bd00-872553ca9b32/nmstate-handler/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.611594 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-x5lcq_291985dd-d623-46ba-9e1b-056dc17d26ed/kube-rbac-proxy/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.664916 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-x5lcq_291985dd-d623-46ba-9e1b-056dc17d26ed/nmstate-metrics/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.753520 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-k4v4s_a776287a-5b99-4f43-8d4c-191108392859/nmstate-operator/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.824588 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-ccbds_10ab1e00-47a1-4f9a-a55a-131935759d8d/nmstate-webhook/0.log" Mar 10 10:04:54 crc kubenswrapper[4883]: I0310 10:04:54.085143 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:04:54 crc kubenswrapper[4883]: E0310 10:04:54.085438 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:05:08 crc kubenswrapper[4883]: I0310 10:05:08.079703 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:05:08 crc kubenswrapper[4883]: E0310 10:05:08.080390 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:05:15 crc kubenswrapper[4883]: I0310 10:05:15.683065 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtrbh_59437559-8b42-4779-8b72-17f09b50b572/kube-rbac-proxy/0.log" Mar 10 10:05:15 crc kubenswrapper[4883]: I0310 10:05:15.747676 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtrbh_59437559-8b42-4779-8b72-17f09b50b572/controller/0.log" Mar 10 10:05:15 crc kubenswrapper[4883]: I0310 10:05:15.896944 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.080699 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.083457 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.107998 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.131762 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 
10:05:16.255275 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.272201 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.276501 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.290818 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.438667 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.445663 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.449674 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.464703 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/controller/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.602856 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/frr-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.606943 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/kube-rbac-proxy/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.623010 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/kube-rbac-proxy-frr/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.800521 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.836677 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-shjnr_8e843a56-715a-44fc-9974-8570d49bd9a0/frr-k8s-webhook-server/0.log" Mar 10 10:05:17 crc kubenswrapper[4883]: I0310 10:05:17.006698 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c79cc77cd-s6vgn_5804aa0d-ee19-4fb3-bd39-27c7103571d8/manager/0.log" Mar 10 10:05:17 crc kubenswrapper[4883]: I0310 10:05:17.202886 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57848ff665-prp4d_cb05036e-52f2-48ab-ba84-f89c4565a0af/webhook-server/0.log" Mar 10 10:05:17 crc kubenswrapper[4883]: I0310 10:05:17.320965 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gtqfn_6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf/kube-rbac-proxy/0.log" Mar 10 10:05:17 crc kubenswrapper[4883]: I0310 10:05:17.746296 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gtqfn_6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf/speaker/0.log" Mar 10 10:05:18 crc kubenswrapper[4883]: I0310 10:05:18.017082 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/frr/0.log" Mar 10 10:05:23 crc kubenswrapper[4883]: I0310 10:05:23.080545 4883 scope.go:117] 
"RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:05:23 crc kubenswrapper[4883]: E0310 10:05:23.081170 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.584172 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.771304 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.787082 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.812374 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.964673 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.993055 4883 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/extract/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.003337 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.132754 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.297574 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.303810 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.313448 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.463756 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.477128 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.789574 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/registry-server/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.865386 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.957186 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.986100 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.988232 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.169890 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.179749 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.384847 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.539136 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.563121 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.581281 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.735637 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/registry-server/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.746200 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.783254 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/extract/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.801101 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.999079 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d6jf_849aec1a-3ce6-4153-8e52-4bf0185e29e3/marketplace-operator/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: 
I0310 10:05:31.007777 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.140347 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.158604 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.167585 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.295774 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.315317 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.424876 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/registry-server/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.493596 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.642645 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.655716 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.656139 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.851943 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.875738 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 10:05:32 crc kubenswrapper[4883]: I0310 10:05:32.326192 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/registry-server/0.log" Mar 10 10:05:37 crc kubenswrapper[4883]: I0310 10:05:37.079935 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:05:37 crc kubenswrapper[4883]: E0310 10:05:37.080737 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:05:48 crc 
kubenswrapper[4883]: I0310 10:05:48.079860 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:05:48 crc kubenswrapper[4883]: E0310 10:05:48.080830 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.136432 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552286-2j26x"] Mar 10 10:06:00 crc kubenswrapper[4883]: E0310 10:06:00.137397 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" containerName="oc" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.137413 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" containerName="oc" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.137626 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" containerName="oc" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.138228 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.140336 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.140811 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.140992 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.147622 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552286-2j26x"] Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.298769 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") pod \"auto-csr-approver-29552286-2j26x\" (UID: \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\") " pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.401820 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") pod \"auto-csr-approver-29552286-2j26x\" (UID: \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\") " pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.425096 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") pod \"auto-csr-approver-29552286-2j26x\" (UID: \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\") " 
pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.467950 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.892230 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552286-2j26x"] Mar 10 10:06:01 crc kubenswrapper[4883]: I0310 10:06:01.079769 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:01 crc kubenswrapper[4883]: E0310 10:06:01.080134 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:01 crc kubenswrapper[4883]: I0310 10:06:01.563271 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-2j26x" event={"ID":"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f","Type":"ContainerStarted","Data":"189fa60d8ccc036fb6052be568d5e610d11a3aed4e2104c741d8f46e9a41f941"} Mar 10 10:06:02 crc kubenswrapper[4883]: I0310 10:06:02.573024 4883 generic.go:334] "Generic (PLEG): container finished" podID="82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" containerID="22cea35a8715f6aa0a973e315de5d9ddc6700c022d27c07bee0f3733e2e64464" exitCode=0 Mar 10 10:06:02 crc kubenswrapper[4883]: I0310 10:06:02.573101 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-2j26x" event={"ID":"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f","Type":"ContainerDied","Data":"22cea35a8715f6aa0a973e315de5d9ddc6700c022d27c07bee0f3733e2e64464"} 
Mar 10 10:06:02 crc kubenswrapper[4883]: E0310 10:06:02.632992 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82c92f22_d1bc_4f9e_83b5_8b485ac02a4f.slice/crio-conmon-22cea35a8715f6aa0a973e315de5d9ddc6700c022d27c07bee0f3733e2e64464.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82c92f22_d1bc_4f9e_83b5_8b485ac02a4f.slice/crio-22cea35a8715f6aa0a973e315de5d9ddc6700c022d27c07bee0f3733e2e64464.scope\": RecentStats: unable to find data in memory cache]" Mar 10 10:06:03 crc kubenswrapper[4883]: I0310 10:06:03.895563 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:03 crc kubenswrapper[4883]: I0310 10:06:03.985550 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") pod \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\" (UID: \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\") " Mar 10 10:06:03 crc kubenswrapper[4883]: I0310 10:06:03.991515 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj" (OuterVolumeSpecName: "kube-api-access-9rsxj") pod "82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" (UID: "82c92f22-d1bc-4f9e-83b5-8b485ac02a4f"). InnerVolumeSpecName "kube-api-access-9rsxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.089559 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.591078 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-2j26x" event={"ID":"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f","Type":"ContainerDied","Data":"189fa60d8ccc036fb6052be568d5e610d11a3aed4e2104c741d8f46e9a41f941"} Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.591128 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="189fa60d8ccc036fb6052be568d5e610d11a3aed4e2104c741d8f46e9a41f941" Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.591194 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.952651 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"] Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.961581 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"] Mar 10 10:06:06 crc kubenswrapper[4883]: I0310 10:06:06.090016 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b2dc78-bc43-4cf8-a946-509772bb2522" path="/var/lib/kubelet/pods/f7b2dc78-bc43-4cf8-a946-509772bb2522/volumes" Mar 10 10:06:12 crc kubenswrapper[4883]: I0310 10:06:12.081376 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:12 crc kubenswrapper[4883]: E0310 10:06:12.082271 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:24 crc kubenswrapper[4883]: I0310 10:06:24.086188 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:24 crc kubenswrapper[4883]: E0310 10:06:24.087127 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:36 crc kubenswrapper[4883]: I0310 10:06:36.083012 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:36 crc kubenswrapper[4883]: E0310 10:06:36.084028 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:49 crc kubenswrapper[4883]: I0310 10:06:49.079926 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:49 crc kubenswrapper[4883]: E0310 10:06:49.081805 4883 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:02 crc kubenswrapper[4883]: I0310 10:07:02.180053 4883 generic.go:334] "Generic (PLEG): container finished" podID="20b00153-6497-4507-8247-81caa30a91bc" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" exitCode=0 Mar 10 10:07:02 crc kubenswrapper[4883]: I0310 10:07:02.180171 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/must-gather-29w7p" event={"ID":"20b00153-6497-4507-8247-81caa30a91bc","Type":"ContainerDied","Data":"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4"} Mar 10 10:07:02 crc kubenswrapper[4883]: I0310 10:07:02.182059 4883 scope.go:117] "RemoveContainer" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" Mar 10 10:07:02 crc kubenswrapper[4883]: I0310 10:07:02.673884 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8cw7n_must-gather-29w7p_20b00153-6497-4507-8247-81caa30a91bc/gather/0.log" Mar 10 10:07:04 crc kubenswrapper[4883]: I0310 10:07:04.084850 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:04 crc kubenswrapper[4883]: E0310 10:07:04.085335 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:07 crc kubenswrapper[4883]: I0310 10:07:07.402613 4883 scope.go:117] "RemoveContainer" containerID="b6eef7765d3e9379f71ea8bde62b6fb0aea46e71cd54c2a9f6788a8b9c14680a" Mar 10 10:07:12 crc kubenswrapper[4883]: I0310 10:07:12.798101 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:07:12 crc kubenswrapper[4883]: I0310 10:07:12.798909 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8cw7n/must-gather-29w7p" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="copy" containerID="cri-o://212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" gracePeriod=2 Mar 10 10:07:12 crc kubenswrapper[4883]: I0310 10:07:12.806039 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.200919 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8cw7n_must-gather-29w7p_20b00153-6497-4507-8247-81caa30a91bc/copy/0.log" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.201745 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.284597 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8cw7n_must-gather-29w7p_20b00153-6497-4507-8247-81caa30a91bc/copy/0.log" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.285011 4883 generic.go:334] "Generic (PLEG): container finished" podID="20b00153-6497-4507-8247-81caa30a91bc" containerID="212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" exitCode=143 Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.285082 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.285092 4883 scope.go:117] "RemoveContainer" containerID="212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.288803 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") pod \"20b00153-6497-4507-8247-81caa30a91bc\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.288913 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") pod \"20b00153-6497-4507-8247-81caa30a91bc\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.295112 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx" (OuterVolumeSpecName: "kube-api-access-qphvx") pod "20b00153-6497-4507-8247-81caa30a91bc" (UID: "20b00153-6497-4507-8247-81caa30a91bc"). InnerVolumeSpecName "kube-api-access-qphvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.305405 4883 scope.go:117] "RemoveContainer" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.392362 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.427354 4883 scope.go:117] "RemoveContainer" containerID="212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" Mar 10 10:07:13 crc kubenswrapper[4883]: E0310 10:07:13.434082 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183\": container with ID starting with 212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183 not found: ID does not exist" containerID="212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.434145 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183"} err="failed to get container status \"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183\": rpc error: code = NotFound desc = could not find container \"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183\": container with ID starting with 212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183 not found: ID does not exist" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.434175 4883 scope.go:117] "RemoveContainer" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" Mar 10 10:07:13 crc kubenswrapper[4883]: E0310 10:07:13.436789 
4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4\": container with ID starting with 57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4 not found: ID does not exist" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.436818 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4"} err="failed to get container status \"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4\": rpc error: code = NotFound desc = could not find container \"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4\": container with ID starting with 57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4 not found: ID does not exist" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.463849 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "20b00153-6497-4507-8247-81caa30a91bc" (UID: "20b00153-6497-4507-8247-81caa30a91bc"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.495558 4883 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4883]: I0310 10:07:14.093965 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b00153-6497-4507-8247-81caa30a91bc" path="/var/lib/kubelet/pods/20b00153-6497-4507-8247-81caa30a91bc/volumes" Mar 10 10:07:19 crc kubenswrapper[4883]: I0310 10:07:19.080825 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:19 crc kubenswrapper[4883]: E0310 10:07:19.081440 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:30 crc kubenswrapper[4883]: I0310 10:07:30.080428 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:30 crc kubenswrapper[4883]: E0310 10:07:30.081187 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:44 crc kubenswrapper[4883]: I0310 10:07:44.086969 4883 
scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:44 crc kubenswrapper[4883]: E0310 10:07:44.087873 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:57 crc kubenswrapper[4883]: I0310 10:07:57.079867 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:57 crc kubenswrapper[4883]: E0310 10:07:57.080608 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.144634 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552288-4tlx2"] Mar 10 10:08:00 crc kubenswrapper[4883]: E0310 10:08:00.145598 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="copy" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145614 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="copy" Mar 10 10:08:00 crc kubenswrapper[4883]: E0310 10:08:00.145636 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b00153-6497-4507-8247-81caa30a91bc" 
containerName="gather" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145643 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="gather" Mar 10 10:08:00 crc kubenswrapper[4883]: E0310 10:08:00.145685 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" containerName="oc" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145691 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" containerName="oc" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145880 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="gather" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145901 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="copy" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145917 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" containerName="oc" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.146720 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-4tlx2" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.148670 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.148671 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.149138 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.156755 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552288-4tlx2"] Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.191153 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") pod \"auto-csr-approver-29552288-4tlx2\" (UID: \"b7688fb5-21b4-4c69-be24-7182936f25c6\") " pod="openshift-infra/auto-csr-approver-29552288-4tlx2" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.292906 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") pod \"auto-csr-approver-29552288-4tlx2\" (UID: \"b7688fb5-21b4-4c69-be24-7182936f25c6\") " pod="openshift-infra/auto-csr-approver-29552288-4tlx2" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.314106 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") pod \"auto-csr-approver-29552288-4tlx2\" (UID: \"b7688fb5-21b4-4c69-be24-7182936f25c6\") " 
pod="openshift-infra/auto-csr-approver-29552288-4tlx2"
Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.471832 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-4tlx2"
Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.863456 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552288-4tlx2"]
Mar 10 10:08:01 crc kubenswrapper[4883]: I0310 10:08:01.771648 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552288-4tlx2" event={"ID":"b7688fb5-21b4-4c69-be24-7182936f25c6","Type":"ContainerStarted","Data":"777e00b0fa3a80d69c134549338f80748efa319fd4ca35a74dc05f74fa9f8320"}
Mar 10 10:08:02 crc kubenswrapper[4883]: I0310 10:08:02.784557 4883 generic.go:334] "Generic (PLEG): container finished" podID="b7688fb5-21b4-4c69-be24-7182936f25c6" containerID="414d36a21e0b760297ba35c1fb5d7c1ceb38d2bb928b62f2d14e133f97df7d40" exitCode=0
Mar 10 10:08:02 crc kubenswrapper[4883]: I0310 10:08:02.784633 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552288-4tlx2" event={"ID":"b7688fb5-21b4-4c69-be24-7182936f25c6","Type":"ContainerDied","Data":"414d36a21e0b760297ba35c1fb5d7c1ceb38d2bb928b62f2d14e133f97df7d40"}
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.084983 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-4tlx2"
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.276698 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") pod \"b7688fb5-21b4-4c69-be24-7182936f25c6\" (UID: \"b7688fb5-21b4-4c69-be24-7182936f25c6\") "
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.283234 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc" (OuterVolumeSpecName: "kube-api-access-zzrdc") pod "b7688fb5-21b4-4c69-be24-7182936f25c6" (UID: "b7688fb5-21b4-4c69-be24-7182936f25c6"). InnerVolumeSpecName "kube-api-access-zzrdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.379530 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.802948 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552288-4tlx2" event={"ID":"b7688fb5-21b4-4c69-be24-7182936f25c6","Type":"ContainerDied","Data":"777e00b0fa3a80d69c134549338f80748efa319fd4ca35a74dc05f74fa9f8320"}
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.802993 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="777e00b0fa3a80d69c134549338f80748efa319fd4ca35a74dc05f74fa9f8320"
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.803049 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-4tlx2"
Mar 10 10:08:05 crc kubenswrapper[4883]: I0310 10:08:05.139770 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"]
Mar 10 10:08:05 crc kubenswrapper[4883]: I0310 10:08:05.146692 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"]
Mar 10 10:08:06 crc kubenswrapper[4883]: I0310 10:08:06.092941 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ca7076-03d6-4598-a451-cf485909b9fc" path="/var/lib/kubelet/pods/38ca7076-03d6-4598-a451-cf485909b9fc/volumes"
Mar 10 10:08:07 crc kubenswrapper[4883]: I0310 10:08:07.468101 4883 scope.go:117] "RemoveContainer" containerID="a0e857f4b7d8648de7f831deee2dafbed19a142ef98ff1e54d0826fe04524086"
Mar 10 10:08:08 crc kubenswrapper[4883]: I0310 10:08:08.080420 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:08 crc kubenswrapper[4883]: E0310 10:08:08.081508 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:08:22 crc kubenswrapper[4883]: I0310 10:08:22.079652 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:22 crc kubenswrapper[4883]: E0310 10:08:22.080705 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:08:34 crc kubenswrapper[4883]: I0310 10:08:34.087387 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:34 crc kubenswrapper[4883]: E0310 10:08:34.088779 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:08:47 crc kubenswrapper[4883]: I0310 10:08:47.079917 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:47 crc kubenswrapper[4883]: E0310 10:08:47.080876 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:08:58 crc kubenswrapper[4883]: I0310 10:08:58.080085 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:58 crc kubenswrapper[4883]: E0310 10:08:58.081002 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:09:08 crc kubenswrapper[4883]: I0310 10:09:08.037600 4883 scope.go:117] "RemoveContainer" containerID="3bd8ffbe632d745f21bae1bcada8aafee95f3789b72071a8dc962e253bd69366"
Mar 10 10:09:08 crc kubenswrapper[4883]: I0310 10:09:08.059623 4883 scope.go:117] "RemoveContainer" containerID="5f767fd40e7a1f328bfb19f2b7318e01d59fdbb800fcd79f270e6c2ebac7a271"
Mar 10 10:09:11 crc kubenswrapper[4883]: I0310 10:09:11.080340 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:09:11 crc kubenswrapper[4883]: E0310 10:09:11.081406 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:09:24 crc kubenswrapper[4883]: I0310 10:09:24.089208 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:09:24 crc kubenswrapper[4883]: I0310 10:09:24.600748 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"b8cc001f4c760929811289dd62bac577b874780c74323643ab5e9dd20562531d"}